ColossalAI/colossalai/nn/lr_scheduler
Latest commit: HELSON dceae85195 Added MoE parallel (#127), 3 years ago
__init__.py Support TP-compatible Torch AMP and Update trainer API (#27) 3 years ago
cosine.py Added MoE parallel (#127) 3 years ago
delayed.py Develop/experiments (#59) 3 years ago
linear.py Support TP-compatible Torch AMP and Update trainer API (#27) 3 years ago
multistep.py Develop/experiments (#59) 3 years ago
onecycle.py Develop/experiments (#59) 3 years ago
poly.py Develop/experiments (#59) 3 years ago
torch.py Develop/experiments (#59) 3 years ago
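The files above hold Colossal-AI's learning-rate schedulers (cosine, linear, multistep, one-cycle, polynomial, plus a delayed wrapper). As a rough illustration of the kind of schedule a file like `cosine.py` covers, here is a minimal, dependency-free sketch of linear warmup followed by cosine annealing; the function name and parameters are illustrative, not the library's actual API:

```python
import math

def cosine_warmup_lr(step, base_lr, warmup_steps, total_steps):
    """Illustrative warmup + cosine-decay schedule (not Colossal-AI's API).

    Linearly ramps the learning rate from ~0 to base_lr over warmup_steps,
    then decays it to 0 along a half-cosine over the remaining steps.
    """
    if step < warmup_steps:
        # Linear warmup: fraction of base_lr grows with each step.
        return base_lr * (step + 1) / warmup_steps
    # Cosine decay: progress goes 0 -> 1 after warmup ends.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

# Example: lr ramps up for 10 steps, then decays toward 0 by step 100.
schedule = [cosine_warmup_lr(s, 0.1, 10, 100) for s in range(100)]
```

In the library itself, schedulers of this shape typically subclass `torch.optim.lr_scheduler._LRScheduler` and are stepped once per iteration or epoch alongside the optimizer.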