ColossalAI/colossalai/utils
Latest commit 055fbf5be6 by HELSON: [zero] adapt zero for unsharded paramters (Optimizer part) (#601), 3 years ago
data_sampler/
gradient_accumulation/
memory_tracer/              polish utils docstring (#620)                                        3 years ago
memory_utils/
multi_tensor_apply/
profiler/                   polish utils docstring (#620)                                        3 years ago
tensor_detector/
__init__.py                 [model checkpoint] updated checkpoint save/load utils (#592)         3 years ago
activation_checkpoint.py
checkpointing.py            [zero] adapt zero for unsharded paramters (Optimizer part) (#601)    3 years ago
common.py
cuda.py
moe.py
timer.py
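For orientation, below is a minimal usage sketch of two of these utilities, `get_current_device` (cuda.py) and `Timer` (timer.py). It assumes both names are re-exported from `colossalai.utils` at this revision and that `Timer.stop()` accepts a `keep_in_history` argument; treat the exact exports and signatures as version-dependent rather than authoritative.

```python
# Hedged sketch: assumes get_current_device (cuda.py) and Timer (timer.py)
# are re-exported by colossalai.utils; exact signatures may vary by version.
import torch
from colossalai.utils import get_current_device, Timer

# get_current_device resolves the active CUDA device at this revision,
# so guard the call for CPU-only environments.
device = get_current_device() if torch.cuda.is_available() else torch.device("cpu")

timer = Timer()
timer.start()
x = torch.randn(1024, 1024, device=device)
y = x @ x
elapsed = timer.stop(keep_in_history=True)  # assumed keyword; returns elapsed seconds
print(f"matmul on {device}: {elapsed:.4f}s")
```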