ColossalAI/colossalai/utils

Latest commit: Frank Lee, e6ec99d389, [utils] fixed lazy init context (#1867), 2 years ago
| Name | Last commit | Last update |
|------|-------------|-------------|
| checkpoint | [hotfix] fix a running error in test_colo_checkpoint.py (#1387) | 2 years ago |
| checkpoint_io | [CheckpointIO] a uniform checkpoint I/O module (#1689) | 2 years ago |
| data_sampler | Refactored docstring to google style | 3 years ago |
| model | [utils] fixed lazy init context (#1867) | 2 years ago |
| multi_tensor_apply | [NFC] polish colossalai/utils/multi_tensor_apply/multi_tensor_apply.py code style (#1559) | 2 years ago |
| profiler | [hotfix] remove potiential circle import (#1307) | 2 years ago |
| rank_recorder | [pipeline/rank_recorder] fix bug when process data before backward \| add a tool for multiple ranks debug (#1681) | 2 years ago |
| tensor_detector | [NFC] polish utils/tensor_detector/__init__.py code style (#1573) | 2 years ago |
| __init__.py | [refactory] add nn.parallel module (#1068) | 3 years ago |
| activation_checkpoint.py | [utils] Add use_reetrant=False in utils.activation_checkpoint (#1460) | 2 years ago |
| checkpointing.py | [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) | 2 years ago |
| common.py | [test] fixed the activation codegen test (#1447) | 2 years ago |
| cuda.py | [refactor] refactor the memory utils (#715) | 3 years ago |
| memory.py | [gemini] APIs to set cpu memory capacity (#809) | 3 years ago |
| moe.py | Refactored docstring to google style | 3 years ago |
| timer.py | Refactored docstring to google style | 3 years ago |
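
The activation_checkpoint.py entry above references `use_reentrant=False`. As a point of orientation, the sketch below shows the underlying PyTorch activation-checkpointing mechanism (`torch.utils.checkpoint.checkpoint`) with that flag; it is a minimal illustration of the technique, not the ColossalAI wrapper's own API, and the `Block`/`CheckpointedModel` classes are hypothetical stand-ins.

```python
# Minimal sketch of activation checkpointing with use_reentrant=False.
# Uses torch.utils.checkpoint directly; the colossalai.utils wrapper may
# expose a different interface around the same idea.
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class Block(nn.Module):
    """A hypothetical transformer-style sub-block used for illustration."""

    def __init__(self, dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))

    def forward(self, x):
        return self.net(x)


class CheckpointedModel(nn.Module):
    def __init__(self, num_blocks: int = 4, dim: int = 128):
        super().__init__()
        self.blocks = nn.ModuleList(Block(dim) for _ in range(num_blocks))

    def forward(self, x):
        for block in self.blocks:
            # Recompute each block's activations during backward instead of
            # storing them; use_reentrant=False selects the non-reentrant,
            # autograd-based implementation.
            x = checkpoint(block, x, use_reentrant=False)
        return x


if __name__ == "__main__":
    model = CheckpointedModel()
    x = torch.randn(8, 128, requires_grad=True)
    model(x).sum().backward()
```

The trade-off is the usual one for activation checkpointing: peak memory drops because intermediate activations inside each checkpointed block are discarded after the forward pass, at the cost of recomputing them once during backward.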