ColossalAI/colossalai

Latest commit 21ba89cab6 by Baizhou Zhang (2023-10-17 14:07:21 +08:00):
[gemini] support gradient accumulation (#4869)

* add test
* fix no_sync bug in low level zero plugin
* fix test
* add argument for grad accum
* add grad accum in backward hook for gemini
* finish implementation, rewrite tests
* fix test
* skip stuck model in low level zero test
* update doc
* optimize communication & fix gradient checkpoint
* modify doc
* clean up code
* update cpu adam fp16 case
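The headline commit adds gradient accumulation to the Gemini plugin, with gradients accumulated in a backward hook and an optimizer step taken only once per accumulation window. As a general illustration of that pattern (a plain-Python sketch, not the ColossalAI API; `ToyOptimizer` and `train` are hypothetical names):

```python
# Sketch of gradient accumulation: gradients from several micro-batches
# are summed, and the optimizer steps once per accumulation window.
# This is an illustrative toy, not the ColossalAI/Gemini implementation.

class ToyOptimizer:
    """Hypothetical SGD-like optimizer over a single scalar parameter."""

    def __init__(self, lr=0.5):
        self.lr = lr
        self.param = 0.0
        self.grad = 0.0

    def accumulate(self, micro_batch_grad):
        # In a real framework a backward hook would add grads here.
        self.grad += micro_batch_grad

    def step(self):
        self.param -= self.lr * self.grad
        self.grad = 0.0  # zero the accumulator after each step


def train(micro_batch_grads, accum_steps):
    opt = ToyOptimizer()
    for i, g in enumerate(micro_batch_grads, start=1):
        opt.accumulate(g)
        if i % accum_steps == 0:  # step once per accumulation window
            opt.step()
    return opt.param


# Four micro-batches, window of 2 -> two optimizer steps.
print(train([1.0, 1.0, 2.0, 2.0], accum_steps=2))  # -> -3.0
```

With `accum_steps=2` the first step sees a summed gradient of 2.0 and the second sees 4.0, so the effective update equals what one large batch would produce; this is the usual motivation for gradient accumulation when memory limits the micro-batch size.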
| Name | Last commit | Date |
|---|---|---|
| _C | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2023-01-06 20:50:26 +08:00 |
| _analyzer | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| amp | [feature] Add clip_grad_norm for hybrid_parallel_plugin (#4837) | 2023-10-12 11:32:37 +08:00 |
| auto_parallel | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| autochunk | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| booster | [gemini] support gradient accumulation (#4869) | 2023-10-17 14:07:21 +08:00 |
| checkpoint_io | [checkpointio] hotfix torch 2.0 compatibility (#4824) | 2023-10-07 10:45:52 +08:00 |
| cli | [bug] Fix the version check bug in colossalai run when generating the cmd. (#4713) | 2023-09-22 10:50:47 +08:00 |
| cluster | [doc] polish shardformer doc (#4779) | 2023-09-26 10:57:47 +08:00 |
| context | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| device | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| fx | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| inference | [inference] Add smmoothquant for llama (#4904) | 2023-10-16 11:28:44 +08:00 |
| interface | [lazy] support from_pretrained (#4801) | 2023-09-26 11:04:11 +08:00 |
| kernel | [kernel] support pure fp16 for cpu adam and update gemini optim tests (#4921) | 2023-10-16 21:56:53 +08:00 |
| lazy | [doc] add lazy init docs (#4808) | 2023-09-27 10:24:04 +08:00 |
| legacy | [bug] fix get_default_parser in examples (#4764) | 2023-09-21 10:42:25 +08:00 |
| logging | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| nn | [kernel] support pure fp16 for cpu adam and update gemini optim tests (#4921) | 2023-10-16 21:56:53 +08:00 |
| pipeline | [Pipeline Inference] Sync pipeline inference branch to main (#4820) | 2023-10-11 11:40:06 +08:00 |
| shardformer | [infer] fix test bug (#4838) | 2023-10-04 10:01:03 +08:00 |
| tensor | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| testing | [gemini] support amp o3 for gemini (#4872) | 2023-10-12 10:39:08 +08:00 |
| utils | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| zero | [gemini] support gradient accumulation (#4869) | 2023-10-17 14:07:21 +08:00 |
| __init__.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| initialize.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |