ColossalAI/colossalai/nn/optimizer
Latest commit: 105c5301c3 by LuGY (2022-03-25 18:03:54 +08:00)
[zero] added hybrid adam, removed loss scale in adam (#527)
  * [zero] added hybrid adam, removed loss scale of adam
  * remove useless code
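The "hybrid" in the new hybrid adam refers to running the Adam update with the CPU kernel for parameters resident in host memory and the fused CUDA kernel for parameters on the GPU. A hedged, framework-free sketch of that per-device dispatch (the function name and device strings here are hypothetical, not ColossalAI's actual API):

```python
def hybrid_dispatch(params):
    """Group parameters by device so each group can be handed to the
    matching Adam kernel (CPU kernel vs fused CUDA kernel).
    Sketch only; `params` is a list of (name, device_string) pairs."""
    groups = {"cpu": [], "cuda": []}
    for name, device in params:
        key = "cuda" if device.startswith("cuda") else "cpu"
        groups[key].append(name)
    return groups

groups = hybrid_dispatch([("embed.weight", "cpu"),
                          ("layer1.weight", "cuda:0"),
                          ("layer2.bias", "cuda:0")])
```

In a zero-offload setting this split matters because offloaded fp32 master weights stay on the CPU while the remaining parameters are updated on the GPU, so one optimizer must drive both kernels.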
__init__.py [zero] cpu adam kernel (#288) 2022-03-11 15:50:28 +08:00
colossalai_optimizer.py Develop/experiments (#59) 2021-12-09 15:08:29 +08:00
cpu_adam.py [zero]added hybrid adam, removed loss scale in adam (#527) 2022-03-25 18:03:54 +08:00
fused_adam.py [cuda] modify the fused adam, support hybrid of fp16 and fp32 (#497) 2022-03-25 14:15:53 +08:00
fused_lamb.py [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) 2022-03-21 13:35:04 +08:00
fused_sgd.py refactor kernel (#142) 2022-01-13 16:47:17 +08:00
hybrid_adam.py [zero]added hybrid adam, removed loss scale in adam (#527) 2022-03-25 18:03:54 +08:00
lamb.py Fixed docstring in colossalai (#171) 2022-01-21 10:44:30 +08:00
lars.py Fixed docstring in colossalai (#171) 2022-01-21 10:44:30 +08:00