ColossalAI/colossalai/utils
Latest commit: e2089c5c15 by Frank Lee, "adapted for sequence parallel" (#163), 3 years ago
Name                       Last commit                                                      Last updated
data_sampler/              update examples and sphnix docs for the new api (#63)           3 years ago
gradient_accumulation/     Optimize pipeline schedule (#94)                                 3 years ago
multi_tensor_apply/        update examples and sphnix docs for the new api (#63)           3 years ago
__init__.py                adapted for sequence parallel (#163)                             3 years ago
activation_checkpoint.py   Migrated project                                                 3 years ago
checkpointing.py           Support TP-compatible Torch AMP and Update trainer API (#27)    3 years ago
common.py                  adapted for sequence parallel (#163)                             3 years ago
cuda.py                    Migrated project                                                 3 years ago
memory.py                  adapted for sequence parallel (#163)                             3 years ago
timer.py                   adapted for sequence parallel (#163)                             3 years ago
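
The files above make up ColossalAI's general-purpose utility package. As a rough illustration of how such utilities might be consumed, here is a minimal sketch, assuming `colossalai.utils` re-exports a `checkpoint` helper from `activation_checkpoint.py` (with semantics mirroring `torch.utils.checkpoint.checkpoint`) and a `get_current_device` helper from `cuda.py`; both names and their signatures are assumptions, since this listing only shows the module files.

    # Minimal sketch: `checkpoint` and `get_current_device` are assumed exports
    # of colossalai.utils; the directory listing above only shows module names.
    import torch
    import torch.nn as nn

    from colossalai.utils import checkpoint, get_current_device

    class FeedForward(nn.Module):
        def __init__(self, dim=256):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))

        def forward(self, x):
            # Recompute self.net's activations during the backward pass instead
            # of storing them, trading extra compute for lower peak memory.
            return checkpoint(self.net, x)

    device = get_current_device()  # assumed to return the active CUDA device
    model = FeedForward().to(device)
    x = torch.randn(8, 256, device=device, requires_grad=True)
    model(x).sum().backward()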