ColossalAI/colossalai/zero

Latest commit a45ddf2d5f by ver217: [hotfix] fix sharded optim step and clip_grad_norm (#1226) (2 years ago)
Name                Last commit                                                       Age
init_ctx            [hotfix] fix zero init ctx numel (#1128)                          2 years ago
shard_utils         [gemini] add GeminiMemoryManger (#832)                            3 years ago
sharded_model       warmup ratio configration (#1192)                                 2 years ago
sharded_optim       [hotfix] fix sharded optim step and clip_grad_norm (#1226)        2 years ago
sharded_param       [gemini] add GeminiMemoryManger (#832)                            3 years ago
utils               [refactor] move chunk and chunkmgr to directory gemini (#1182)    2 years ago
__init__.py         [zero] add zero optimizer for ColoTensor (#1046)                  3 years ago
zero_optimizer.py   [zero] zero optim supports loading local state dict (#1171)       2 years ago