ColossalAI/colossalai/utils

Latest commit: HELSON, 84fd7c1d4d, "add moe context, moe utilities and refactor gradient handler (#455)", 3 years ago
| Name | Last commit | Age |
|------|-------------|-----|
| commons | [hotfix] fix initialize bug with zero (#442) | 3 years ago |
| data_sampler | fixed utils docstring and add example to readme (#200) | 3 years ago |
| gradient_accumulation | Fixed docstring in colossalai (#171) | 3 years ago |
| memory_tracer | fixed mem monitor device (#433) | 3 years ago |
| multi_tensor_apply | Fixed docstring in colossalai (#171) | 3 years ago |
| profiler | fixed error when no collective communication in CommProfiler | 3 years ago |
| tensor_detector | Added tensor detector (#393) | 3 years ago |
| __init__.py | Added tensor detector (#393) | 3 years ago |
| activation_checkpoint.py | Added activation offload (#331) | 3 years ago |
| checkpointing.py | fixed mkdir conflict and align yapf config with flake (#220) | 3 years ago |
| common.py | optimized context test time consumption (#446) | 3 years ago |
| cuda.py | Fixed docstring in colossalai (#171) | 3 years ago |
| memory.py | Fixed docstring in colossalai (#171) | 3 years ago |
| moe.py | add moe context, moe utilities and refactor gradient handler (#455) | 3 years ago |
| timer.py | [profiler] primary memory tracer | 3 years ago |