ColossalAI/colossalai/zero

Latest commit 821c6172e2 by ver217: [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442), 2 years ago
init_ctx           [hotfix] fix zero init ctx numel (#1128)                              2 years ago
shard_utils        [gemini] add GeminiMemoryManger (#832)                                3 years ago
sharded_model      [hotfix] shared model returns cpu state_dict (#1328)                  2 years ago
sharded_optim      [checkpoint] sharded optim save/load grad scaler (#1350)              2 years ago
sharded_param      [gemini] add GeminiMemoryManger (#832)                                3 years ago
utils              [hotfix] remove potiential circle import (#1307)                      2 years ago
__init__.py        [zero] add zero optimizer for ColoTensor (#1046)                      3 years ago
zero_optimizer.py  [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442)  2 years ago