ColossalAI/colossalai/zero/gemini

Latest commit: a15ab139ad by Hongxin Liu, "[plugin] support get_grad_norm (#6115)", 3 weeks ago
Name                 Last commit message                                                                           Last updated
chunk                [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059)                  2 months ago
memory_tracer        [misc] fit torch api upgradation and remove legecy import (#6093)                             1 month ago
__init__.py          [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)   1 year ago
gemini_ddp.py        [checkpointio] fix hybrid plugin model save (#6106)                                           4 weeks ago
gemini_hook.py       [gemini] quick fix on possible async operation (#5803)                                        6 months ago
gemini_mgr.py        [chore] remove unnecessary assert since compute list might not be recorded                    6 months ago
gemini_optimizer.py  [plugin] support get_grad_norm (#6115)                                                        3 weeks ago
placement_policy.py  [misc] fit torch api upgradation and remove legecy import (#6093)                             1 month ago
utils.py             [npu] change device to accelerator api (#5239)                                                11 months ago