ColossalAI/colossalai/utils
Latest commit: dceae85195 by HELSON — "Added MoE parallel (#127)", 3 years ago
Name                       Last commit message                                      Last updated
data_sampler/              update examples and sphnix docs for the new api (#63)    3 years ago
gradient_accumulation/     Optimize pipeline schedule (#94)                         3 years ago
multi_tensor_apply/        update examples and sphnix docs for the new api (#63)    3 years ago
__init__.py                Hotfix/Colossalai layers (#92)                           3 years ago
activation_checkpoint.py   Migrated project                                         3 years ago
checkpointing.py           Support TP-compatible Torch AMP and Update trainer API (#27)  3 years ago
common.py                  Added MoE parallel (#127)                                3 years ago
cuda.py                    Migrated project                                         3 years ago
memory.py                  Layer integration (#83)                                  3 years ago
timer.py                   update examples and sphnix docs for the new api (#63)    3 years ago