ColossalAI/docs/source/zh-Hans/features
Latest commit: [gemini] support gradient accumulation (#4869)
Author: Baizhou Zhang (21ba89cab6)
* add test
* fix no_sync bug in low level zero plugin
* fix test
* add argument for grad accum
* add grad accum in backward hook for gemini
* finish implementation, rewrite tests
* fix test
* skip stuck model in low level zero test
* update doc
* optimize communication & fix gradient checkpointing
* modify doc
* clean up code
* update cpu adam fp16 case
Date: 2023-10-17 14:07:21 +08:00
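For context on what the commit above enables: gradient accumulation splits an effective batch into several micro-batches and only performs the optimizer step after the gradients of all micro-batches have been accumulated. The sketch below is a minimal, plain-PyTorch illustration of that pattern, not the Gemini plugin's actual hook implementation; the ColossalAI-specific usage (the new plugin argument and the booster API) is documented in gradient_accumulation_with_booster.md, and the model, data, and `accumulation_steps` below are illustrative placeholders.

```python
import torch

accumulation_steps = 4  # illustrative number of micro-batches per optimizer step

model = torch.nn.Linear(32, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()

for step in range(16):
    inputs = torch.randn(8, 32)
    labels = torch.randint(0, 2, (8,))
    # Scale the loss so the summed gradients match one full-batch gradient.
    loss = criterion(model(inputs), labels) / accumulation_steps
    loss.backward()  # gradients accumulate in param.grad across micro-batches
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()       # apply the accumulated gradient
        optimizer.zero_grad()  # reset before the next accumulation window
```

Dividing the loss by `accumulation_steps` keeps the accumulated gradient consistent with a single full-batch update; per the commit message above, Gemini performs this accumulation inside its backward hook.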
| File | Last commit | Date |
|------|-------------|------|
| 1D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| 2D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| 2p5D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| 3D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| cluster_utils.md | [doc] add booster docstring and fix autodoc (#3789) | 2023-05-22 10:56:47 +08:00 |
| gradient_accumulation_with_booster.md | [gemini] support gradient accumulation (#4869) | 2023-10-17 14:07:21 +08:00 |
| gradient_clipping_with_booster.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| lazy_init.md | [doc] add lazy init docs (#4808) | 2023-09-27 10:24:04 +08:00 |
| mixed_precision_training_with_booster.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| nvme_offload.md | [doc] update nvme offload documents. (#3850) | 2023-05-26 01:22:01 +08:00 |
| pipeline_parallel.md | [shardformer] update pipeline parallel document (#4725) | 2023-09-15 14:32:04 +08:00 |
| shardformer.md | [doc] polish shardformer doc (#4779) | 2023-09-26 10:57:47 +08:00 |
| zero_with_chunk.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |