ColossalAI/colossalai/nn/optimizer

Latest commit: dbe62c67b8 by ver217, "add an example of ViT-B/16 and remove w_norm clipping in LAMB (#29)", 2021-11-18 23:45:09 +08:00
File                                    Last commit                                                           Date
__init__.py                             Migrated project                                                      2021-10-28 18:21:23 +02:00
_utils.py                               Support TP-compatible Torch AMP and Update trainer API (#27)         2021-11-18 19:45:06 +08:00
fp16_optimizer.py                       Migrated project                                                      2021-10-28 18:21:23 +02:00
fused_adam.py                           Migrated project                                                      2021-10-28 18:21:23 +02:00
fused_lamb.py                           Migrated project                                                      2021-10-28 18:21:23 +02:00
fused_sgd.py                            Migrated project                                                      2021-10-28 18:21:23 +02:00
lamb.py                                 add an example of ViT-B/16 and remove w_norm clipping in LAMB (#29)  2021-11-18 23:45:09 +08:00
lars.py                                 update documentation                                                  2021-10-29 09:29:20 +08:00
loss_scaler.py                          Migrated project                                                      2021-10-28 18:21:23 +02:00
zero_redundancy_optimizer_level_1.py    Migrated project                                                      2021-10-28 18:21:23 +02:00
zero_redundancy_optimizer_level_2.py    Support TP-compatible Torch AMP and Update trainer API (#27)         2021-11-18 19:45:06 +08:00
zero_redundancy_optimizer_level_3.py    Support TP-compatible Torch AMP and Update trainer API (#27)         2021-11-18 19:45:06 +08:00
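The "remove w_norm clipping" note in the latest commit refers to the layer-wise trust ratio that LAMB (You et al., 2019) computes in lamb.py. As a rough sketch of that computation, not the actual code in lamb.py (function and variable names here are illustrative), the per-layer scaling without weight-norm clipping looks like this:

```python
import torch

def lamb_trust_ratio(param: torch.Tensor, adam_update: torch.Tensor) -> float:
    """Per-layer trust ratio from the LAMB paper.

    The paper optionally passes ||w|| through a clipping function phi
    (e.g. clamping it to a fixed range); per commit #29 that clipping is
    removed, so the raw weight norm is used directly, as done here.
    """
    w_norm = param.detach().norm(p=2)   # raw ||w||, no clipping
    u_norm = adam_update.norm(p=2)      # ||m_hat / (sqrt(v_hat) + eps) + wd * w||
    if w_norm == 0 or u_norm == 0:
        return 1.0                      # degenerate case: fall back to a plain Adam step
    return (w_norm / u_norm).item()

# The optimizer step then scales the learning rate per layer:
#   w <- w - lr * trust_ratio * adam_update
```

The LARS optimizer in lars.py applies an analogous layer-wise trust ratio to SGD-style updates rather than Adam-style ones.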