ColossalAI/colossalai/zero/gemini

Latest commit: ff507b755e by hxwang, "Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch" (6 months ago)
Name                  Last commit                                                                                    Last updated
chunk/                Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch                          6 months ago
memory_tracer/        [npu] change device to accelerator api (#5239)                                                11 months ago
__init__.py           [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)   1 year ago
gemini_ddp.py         Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch                          6 months ago
gemini_hook.py        [bug] fix early return (#5740)                                                                6 months ago
gemini_mgr.py         [bug] fix early return (#5740)                                                                6 months ago
gemini_optimizer.py   [gemini] async grad chunk reduce (all-reduce&reduce-scatter) (#5713)                          6 months ago
placement_policy.py   refactor the code structure to solve the circular import                                      6 months ago
utils.py              [npu] change device to accelerator api (#5239)                                                11 months ago
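For orientation, below is a minimal sketch of how the pieces in this directory are typically combined: the DDP wrapper from gemini_ddp.py chunks and places parameters (placement_policy.py, chunk/), and the ZeRO optimizer from gemini_optimizer.py steps over those chunks. It assumes a torchrun launch on a CUDA machine and uses the GeminiDDP, GeminiOptimizer, and HybridAdam names exported by ColossalAI; the launch call and argument choices are illustrative assumptions, not the project's canonical example.

    # Minimal sketch (not the project's own example) of using the Gemini wrappers.
    import torch
    import torch.nn as nn

    import colossalai
    from colossalai.nn.optimizer import HybridAdam
    from colossalai.zero import GeminiDDP, GeminiOptimizer

    def main():
        # Initialize torch.distributed from the torchrun environment
        # (older ColossalAI releases also expected a config dict here).
        colossalai.launch_from_torch()

        model = nn.Sequential(nn.Linear(64, 64), nn.GELU(), nn.Linear(64, 8)).cuda()

        # gemini_ddp.py: partitions parameters into chunks and manages their
        # placement between GPU and CPU memory (see placement_policy.py).
        model = GeminiDDP(model, placement_policy="static")

        # gemini_optimizer.py: ZeRO optimizer over the chunked parameters;
        # it expects one of ColossalAI's fused Adam variants such as HybridAdam.
        optimizer = GeminiOptimizer(HybridAdam(model.parameters(), lr=1e-3), model)

        x = torch.randn(16, 64, device="cuda")
        loss = model(x).float().pow(2).mean()
        optimizer.backward(loss)  # backward goes through the wrapper (gemini_hook.py)
        optimizer.step()
        optimizer.zero_grad()

    if __name__ == "__main__":
        main()

Run with e.g. `torchrun --nproc_per_node=1 script.py`; with more processes, the chunks are sharded across ranks in the usual ZeRO fashion.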