ColossalAI/colossalai/utils
Latest commit: Jiarui Fang, 7ef3507ace, "[zero] show model data cuda memory usage after zero context init. (#515)", 3 years ago
Name                      Last commit                                                                Age
data_sampler/             fixed utils docstring and add example to readme (#200)                     3 years ago
gradient_accumulation/    Fixed docstring in colossalai (#171)                                       3 years ago
memory_tracer/            [zero] show model data cuda memory usage after zero context init. (#515)   3 years ago
memory_utils/             [memory] set cuda mem frac (#506)                                          3 years ago
multi_tensor_apply/       Fixed docstring in colossalai (#171)                                       3 years ago
profiler/                 fixed error when no collective communication in CommProfiler               3 years ago
tensor_detector/          Added tensor detector (#393)                                               3 years ago
__init__.py               [memory] add model data tensor moving api (#503)                           3 years ago
activation_checkpoint.py  Added activation offload (#331)                                            3 years ago
checkpointing.py          fixed mkdir conflict and align yapf config with flake (#220)               3 years ago
common.py                 [MOE] remove old MoE legacy (#493)                                         3 years ago
cuda.py                   Fixed docstring in colossalai (#171)                                       3 years ago
moe.py                    [polish] polish singleton and global context (#500)                        3 years ago
timer.py                  [profiler] primary memory tracer                                           3 years ago
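To illustrate the kind of utility this directory collects, here is a minimal sketch of a start/stop wall-clock timer in the spirit of timer.py. The `Timer` class, its method names, and its behavior are illustrative assumptions for this sketch, not the actual ColossalAI API.

```python
import time


class Timer:
    """Minimal start/stop timer sketch (illustrative; not the ColossalAI API)."""

    def __init__(self):
        self._start = None      # perf_counter reading when start() was called
        self._elapsed = 0.0     # total time accumulated across start/stop pairs

    def start(self):
        # Record the wall-clock reading at which timing begins.
        self._start = time.perf_counter()

    def stop(self):
        # Accumulate the interval since start() and return it.
        interval = time.perf_counter() - self._start
        self._elapsed += interval
        self._start = None
        return interval

    @property
    def elapsed(self):
        # Total time accumulated so far, in seconds.
        return self._elapsed


timer = Timer()
timer.start()
total = sum(range(1000))  # some work to time
interval = timer.stop()
```

`time.perf_counter()` is used rather than `time.time()` because it is a monotonic, high-resolution clock, which is the usual choice for measuring short intervals.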