ColossalAI/colossalai/zero

Latest commit: 20e255d4e8 by Zihao, "MemStatsCollectorStatic (#1765)", 2 years ago
Name               Last commit message                                                         Last updated
init_ctx           [moe] fix MoE bugs (#1628)                                                  2 years ago
shard_utils        [gemini] add GeminiMemoryManger (#832)                                      3 years ago
sharded_model      MemStatsCollectorStatic (#1765)                                             2 years ago
sharded_optim      fix move fp32 shards (#1604)                                                2 years ago
sharded_param      [NFC] polish colossalai/zero/sharded_param/__init__.py code style (#1717)   2 years ago
utils              [zero] add constant placement policy (#1705)                                2 years ago
__init__.py        [zero] add zero optimizer for ColoTensor (#1046)                            3 years ago
zero_optimizer.py  [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)   2 years ago