ColossalAI/colossalai/zero
Latest commit 7a05367101 by ver217: [hotfix] shared model returns cpu state_dict (#1328), 2 years ago
init_ctx [hotfix] fix zero init ctx numel (#1128) 3 years ago
shard_utils [gemini] add GeminiMemoryManger (#832) 3 years ago
sharded_model [hotfix] shared model returns cpu state_dict (#1328) 2 years ago
sharded_optim [hotfix] fix sharded optim step and clip_grad_norm (#1226) 2 years ago
sharded_param [gemini] add GeminiMemoryManger (#832) 3 years ago
utils [hotfix] remove potiential circle import (#1307) 2 years ago
__init__.py [zero] add zero optimizer for ColoTensor (#1046) 3 years ago
zero_optimizer.py [zero] zero optim supports loading local state dict (#1171) 2 years ago