ColossalAI/colossalai/zero
Latest commit: 8106d7b8c7 by ver217, "[ddp] refactor ColoDDP and ZeroDDP (#1146)", 2 years ago
Name               Last commit message                                                  Last commit date
init_ctx           [hotfix] fix zero init ctx numel (#1128)                             2 years ago
shard_utils        [gemini] add GeminiMemoryManger (#832)                               3 years ago
sharded_model      [hotfix] prevent nested ZeRO (#1140)                                 2 years ago
sharded_optim      [hotfix] prevent nested ZeRO (#1140)                                 2 years ago
sharded_param      [gemini] add GeminiMemoryManger (#832)                               3 years ago
utils              [zero] avoid zero hook spam by changing log to debug level (#1137)   2 years ago
__init__.py        [zero] add zero optimizer for ColoTensor (#1046)                     3 years ago
zero_optimizer.py  [ddp] refactor ColoDDP and ZeroDDP (#1146)                           2 years ago