ColossalAI/colossalai/utils
Latest commit: [zero] add constant placement policy (#1705) · HELSON · 1468e4bcfc · 2 years ago
Directories:
    checkpoint/
    data_sampler/
    model/                      [zero] add constant placement policy (#1705) · 2 years ago
    multi_tensor_apply/         [NFC] polish colossalai/utils/multi_tensor_apply/multi_tensor_apply.py code style (#1559) · 2 years ago
    profiler/
    rank_recorder/              [pipeline/rank_recorder] fix bug when process data before backward | add a tool for multiple ranks debug (#1681) · 2 years ago
    tensor_detector/            [NFC] polish utils/tensor_detector/__init__.py code style (#1573) · 2 years ago

Files:
    __init__.py
    activation_checkpoint.py    [utils] Add use_reetrant=False in utils.activation_checkpoint (#1460) · 2 years ago · (see the sketch after this list)
    checkpointing.py            [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) · 2 years ago
    common.py
    cuda.py
    memory.py
    moe.py
    timer.py
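
The activation_checkpoint.py commit above refers to the non-reentrant checkpointing mode (use_reentrant=False). As a minimal sketch of the technique such a utility builds on, the following uses PyTorch's own torch.utils.checkpoint rather than ColossalAI's wrapper, whose exact signature may differ; the CheckpointedBlock module and its dimensions are hypothetical.

# Minimal sketch of non-reentrant activation checkpointing with plain
# PyTorch. ColossalAI's utils.activation_checkpoint wraps this idea;
# the module below is a hypothetical example, not the library's API.
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class CheckpointedBlock(nn.Module):
    """A block whose intermediate activations are recomputed during
    backward instead of being stored, trading compute for memory."""

    def __init__(self, dim: int = 256):
        super().__init__()
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # use_reentrant=False selects the non-reentrant implementation,
        # which supports keyword arguments and does not require every
        # input tensor to have requires_grad=True.
        return checkpoint(self.ff, x, use_reentrant=False)


if __name__ == "__main__":
    block = CheckpointedBlock()
    x = torch.randn(4, 256, requires_grad=True)
    # Activations inside self.ff are recomputed during this backward pass.
    block(x).sum().backward()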