ColossalAI/colossalai/zero

Latest commit: a444633d13 "warmup ratio configration (#1192)" by Jiarui Fang, 2 years ago
| Name | Last commit | Age |
| --- | --- | --- |
| init_ctx | [hotfix] fix zero init ctx numel (#1128) | 2 years ago |
| shard_utils | [gemini] add GeminiMemoryManger (#832) | 3 years ago |
| sharded_model | warmup ratio configration (#1192) | 2 years ago |
| sharded_optim | [zero] sharded optim supports loading local state dict (#1170) | 2 years ago |
| sharded_param | [gemini] add GeminiMemoryManger (#832) | 3 years ago |
| utils | [refactor] move chunk and chunkmgr to directory gemini (#1182) | 2 years ago |
| __init__.py | [zero] add zero optimizer for ColoTensor (#1046) | 3 years ago |
| zero_optimizer.py | [zero] zero optim supports loading local state dict (#1171) | 2 years ago |
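The subpackages listed above make up ColossalAI's ZeRO stack: `init_ctx` shards parameters at model construction time, `shard_utils` supplies the shard strategies, `sharded_model` and `sharded_optim` wrap the model and optimizer so gradients and optimizer states are also partitioned, and `zero_optimizer.py` targets the newer ColoTensor/Gemini path. A minimal sketch of how the older sharded components were typically combined is shown below; it is based on the 0.1.x-era API, and the exact constructor arguments are assumptions that may differ between releases.

```python
# Hedged sketch: wiring the zero/ subpackages together, assuming the
# ColossalAI 0.1.x-era API. Constructor arguments may differ per release.
import torch
import torch.nn as nn

from colossalai.zero.init_ctx import ZeroInitContext
from colossalai.zero.shard_utils import TensorShardStrategy
from colossalai.zero.sharded_model import ShardedModelV2
from colossalai.zero.sharded_optim import ShardedOptimizerV2

shard_strategy = TensorShardStrategy()

# Parameters are sharded across data-parallel ranks as the model is built,
# so no single rank has to materialize the full set of weights.
with ZeroInitContext(target_device=torch.device('cuda'),
                     shard_strategy=shard_strategy,
                     shard_param=True):
    model = nn.Linear(1024, 1024)

# Wrap the model and optimizer so gradients and optimizer states are
# partitioned as well (ZeRO stage 2/3 style behavior).
model = ShardedModelV2(model, shard_strategy)
optimizer = ShardedOptimizerV2(model, torch.optim.Adam(model.parameters(), lr=1e-3))
```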