ColossalAI/colossalai/utils

Latest commit: HELSON e6d50ec107 — [zero] adapt zero for unsharded parameters (#561) — 3 years ago
| Name | Last commit | Age |
| --- | --- | --- |
| data_sampler | Refactored docstring to google style | 3 years ago |
| gradient_accumulation | Refactored docstring to google style | 3 years ago |
| memory_tracer | [zero] adapt zero for unsharded parameters (#561) | 3 years ago |
| memory_utils | [zero] trace states of fp16/32 grad and fp32 param (#571) | 3 years ago |
| multi_tensor_apply | Refactored docstring to google style | 3 years ago |
| profiler | html refactor (#555) | 3 years ago |
| tensor_detector | Refactored docstring to google style | 3 years ago |
| __init__.py | [memory] add model data tensor moving api (#503) | 3 years ago |
| activation_checkpoint.py | Refactored docstring to google style | 3 years ago |
| checkpointing.py | Refactored docstring to google style | 3 years ago |
| common.py | Refactored docstring to google style | 3 years ago |
| cuda.py | Fixed docstring in colossalai (#171) | 3 years ago |
| moe.py | Refactored docstring to google style | 3 years ago |
| timer.py | Refactored docstring to google style | 3 years ago |