ColossalAI/colossalai/zero/gemini

Latest commit: Baizhou Zhang, d99b2c961a — [hotfix] fix grad accumulation plus clipping for gemini (#5002), 1 year ago
| Name | Last commit | Age |
| --- | --- | --- |
| chunk | [hotfix] fix grad accumulation plus clipping for gemini (#5002) | 1 year ago |
| memory_tracer | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| __init__.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| colo_init_context.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| gemini_ddp.py | [hotfix] fix grad accumulation plus clipping for gemini (#5002) | 1 year ago |
| gemini_hook.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| gemini_mgr.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| gemini_optimizer.py | [gemini] support gradient accumulation (#4869) | 1 year ago |
| placement_policy.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| utils.py | [gemini] support amp o3 for gemini (#4872) | 1 year ago |