ColossalAI/colossalai/utils
Latest commit: HELSON, 7544347145, "[MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)", 3 years ago
Name                       Last commit                                                                     Last update
commons/                   [hotfix] fix initialize bug with zero (#442)                                    3 years ago
data_sampler/              fixed utils docstring and add example to readme (#200)                          3 years ago
gradient_accumulation/     Fixed docstring in colossalai (#171)                                            3 years ago
memory_tracer/             fixed mem monitor device (#433)                                                 3 years ago
multi_tensor_apply/        Fixed docstring in colossalai (#171)                                            3 years ago
profiler/                  fixed error when no collective communication in CommProfiler                    3 years ago
tensor_detector/           Added tensor detector (#393)                                                    3 years ago
__init__.py                Added tensor detector (#393)                                                    3 years ago
activation_checkpoint.py   Added activation offload (#331)                                                 3 years ago
checkpointing.py           fixed mkdir conflict and align yapf config with flake (#220)                    3 years ago
common.py                  [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago
cuda.py                    Fixed docstring in colossalai (#171)                                            3 years ago
memory.py                  Fixed docstring in colossalai (#171)                                            3 years ago
moe.py                     [MOE] polish moe_env (#467)                                                     3 years ago
timer.py                   [profiler] primary memory tracer                                                3 years ago