ColossalAI/colossalai/nn/optimizer

Latest commit by Frank Lee (3defa32aee): Support TP-compatible Torch AMP and Update trainer API (#27), 3 years ago
__init__.py
_utils.py
fp16_optimizer.py
fused_adam.py
fused_lamb.py
fused_sgd.py
lamb.py
lars.py
loss_scaler.py
zero_redundancy_optimizer_level_1.py
zero_redundancy_optimizer_level_2.py
zero_redundancy_optimizer_level_3.py
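Among the files above, `lamb.py` implements the LAMB optimizer, whose distinguishing feature is a layer-wise trust ratio that rescales an Adam-style step by the ratio of the parameter norm to the update norm. As a rough illustration of that update rule (a minimal NumPy sketch of the published LAMB algorithm, not ColossalAI's actual implementation; the function name `lamb_step` and its defaults are assumptions for this example):

```python
import numpy as np

def lamb_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-6, weight_decay=0.01):
    """One LAMB update for a single parameter tensor (illustrative sketch)."""
    m = beta1 * m + (1 - beta1) * g          # first moment, Adam-style
    v = beta2 * v + (1 - beta2) * g * g      # second moment
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    update = m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w
    w_norm = np.linalg.norm(w)
    u_norm = np.linalg.norm(update)
    # Layer-wise trust ratio: scale the step by ||w|| / ||update||,
    # falling back to 1 when either norm is zero.
    trust = w_norm / u_norm if w_norm > 0 and u_norm > 0 else 1.0
    w = w - lr * trust * update
    return w, m, v
```

The trust ratio is what lets LAMB keep training stable at very large batch sizes: layers whose updates are large relative to their weights take proportionally smaller steps. The `fused_*.py` variants in this directory expose the same optimizers with fused CUDA kernels for speed.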