ColossalAI/colossalai/nn
Latest commit: 62b0a8d644 by ver217 — [zero] sharded optim support hybrid cpu adam (#486) (2022-03-22 14:56:59 +08:00)
* sharded optim support hybrid cpu adam
* update unit test
* polish docstring
layer         [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480)               2022-03-22 10:50:20 +08:00
loss          [MOE] polish moe_env (#467)                                               2022-03-19 15:36:25 +08:00
lr_scheduler  Fixed docstring in colossalai (#171)                                      2022-01-21 10:44:30 +08:00
metric        fixed CI dataset directory; fixed import error of 2.5d accuracy (#255)    2022-03-11 15:50:28 +08:00
model         Develop/experiments (#59)                                                 2021-12-09 15:08:29 +08:00
optimizer     [zero] sharded optim support hybrid cpu adam (#486)                       2022-03-22 14:56:59 +08:00
__init__.py   Layer integration (#83)                                                   2021-12-27 15:04:32 +08:00
init.py       Layer integration (#83)                                                   2021-12-27 15:04:32 +08:00