ColossalAI/colossalai/utils

Latest commit: 250be4d31e by Frank Lee, "[utils] integrated colotensor with lazy init context (#1324)", 2 years ago
checkpoint                  [checkpoint] add ColoOptimizer checkpointing (#1316)            2 years ago
data_sampler                Refactored docstring to google style                             3 years ago
model                       [utils] integrated colotensor with lazy init context (#1324)    2 years ago
multi_tensor_apply          Refactored docstring to google style                             3 years ago
profiler                    [hotfix] remove potiential circle import (#1307)                 2 years ago
tensor_detector             Refactored docstring to google style                             3 years ago
__init__.py                 [refactory] add nn.parallel module (#1068)                       3 years ago
activation_checkpoint.py    [util] fixed activation checkpointing on torch 1.9 (#719)       3 years ago
checkpointing.py            polish checkpoint docstring (#637)                               3 years ago
common.py                   [hotfix] fix sharded optim step and clip_grad_norm (#1226)      2 years ago
cuda.py                     [refactor] refactor the memory utils (#715)                     3 years ago
memory.py                   [gemini] APIs to set cpu memory capacity (#809)                 3 years ago
moe.py                      Refactored docstring to google style                            3 years ago
timer.py                    Refactored docstring to google style                            3 years ago
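The directory collects small runtime utilities (activation checkpointing, CUDA and memory helpers, a profiler, MoE utilities, and timers). As a rough illustration of how a timing utility from timer.py is typically used, here is a minimal sketch; the Timer class name and its start()/stop() methods are assumptions inferred from the module name and may not match the actual API of this version.

    # Minimal sketch of timing a workload with a utility like timer.py.
    # Assumption: colossalai.utils exports a Timer with start()/stop();
    # verify against the actual module before relying on this.
    import torch
    from colossalai.utils import Timer  # assumed export

    timer = Timer()
    timer.start()
    out = torch.randn(1024, 1024) @ torch.randn(1024, 1024)  # example workload
    elapsed = timer.stop()  # assumed to return elapsed time in seconds
    print(f"matmul took {elapsed:.4f}s")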