ColossalAI/colossalai/nn/optimizer

Latest commit 9ec1ce6ab1 by ver217: [zero] sharded model support the reuse of fp16 shard (#495), 3 years ago
File                      Last commit                                                                    Age
__init__.py               [zero] cpu adam kernel (#288)                                                  3 years ago
colossalai_optimizer.py   Develop/experiments (#59)                                                      3 years ago
cpu_adam.py               [zero] sharded model support the reuse of fp16 shard (#495)                    3 years ago
fused_adam.py             [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago
fused_lamb.py             [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago
fused_sgd.py              refactor kernel (#142)                                                         3 years ago
lamb.py                   Fixed docstring in colossalai (#171)                                           3 years ago
lars.py                   Fixed docstring in colossalai (#171)                                           3 years ago
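These modules hold the optimizer implementations that ColossalAI ships under colossalai.nn.optimizer. As a minimal sketch, assuming the classes are re-exported through the package __init__.py and follow the standard torch.optim.Optimizer interface (exact constructor arguments may differ between versions), a pure-PyTorch optimizer such as Lamb from lamb.py can be dropped into an ordinary training loop:

```python
# Minimal sketch, not a definitive usage guide: assumes Lamb is re-exported
# from colossalai.nn.optimizer and accepts the usual lr/weight_decay args.
import torch
import torch.nn as nn
from colossalai.nn.optimizer import Lamb

model = nn.Linear(128, 10)
optimizer = Lamb(model.parameters(), lr=1e-3, weight_decay=0.01)

# Standard PyTorch training step: forward, backward, step, zero_grad.
inputs = torch.randn(4, 128)
loss = model(inputs).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

The fused variants (fused_adam.py, fused_lamb.py, fused_sgd.py) and the CPUAdam in cpu_adam.py expose the same step() interface but rely on ColossalAI's compiled CUDA and C++ kernels (see the "[zero] cpu adam kernel (#288)" commit), so they require the corresponding extensions to be built.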