ColossalAI/colossalai/nn
Latest commit: d7ea63992b by HELSON, [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480), 2022-03-22 10:50:20 +08:00
| Name | Last commit | Last commit date |
| --- | --- | --- |
| layer | [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480) | 2022-03-22 10:50:20 +08:00 |
| loss | [MOE] polish moe_env (#467) | 2022-03-19 15:36:25 +08:00 |
| lr_scheduler | Fixed docstring in colossalai (#171) | 2022-01-21 10:44:30 +08:00 |
| metric | fixed CI dataset directory; fixed import error of 2.5d accuracy (#255) | 2022-03-11 15:50:28 +08:00 |
| model | Develop/experiments (#59) | 2021-12-09 15:08:29 +08:00 |
| optimizer | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 2022-03-21 13:35:04 +08:00 |
| __init__.py | Layer integration (#83) | 2021-12-27 15:04:32 +08:00 |
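The subpackages listed above make up the public `colossalai.nn` namespace: parallel-aware layers, losses, learning-rate schedulers, metrics, model utilities, and optimizers. A minimal sketch of how they are typically combined is shown below; the specific class names and signatures used (`Linear`, `CrossEntropyLoss`, `Lamb`, `CosineAnnealingLR`) are assumptions for illustration and should be checked against the exports in this revision's `__init__.py`.

```python
# Minimal sketch combining the colossalai.nn subpackages.
# NOTE: the class names used here (Linear, CrossEntropyLoss, Lamb,
# CosineAnnealingLR) are assumptions for illustration; verify them against
# colossalai/nn/__init__.py at this revision. The parallel-aware layers are
# normally used after colossalai.launch(...) has initialized the parallel context.
import torch
import colossalai.nn as col_nn


class TinyClassifier(torch.nn.Module):
    """Two linear layers built from the (assumed) colossalai.nn.layer exports."""

    def __init__(self, in_dim: int = 256, hidden: int = 512, num_classes: int = 10):
        super().__init__()
        self.fc1 = col_nn.Linear(in_dim, hidden)
        self.fc2 = col_nn.Linear(hidden, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc2(torch.relu(self.fc1(x)))


model = TinyClassifier()
criterion = col_nn.CrossEntropyLoss()                 # loss/ subpackage (assumed export)
optimizer = col_nn.Lamb(model.parameters(), lr=1e-3)  # optimizer/ subpackage (assumed export)
scheduler = col_nn.CosineAnnealingLR(optimizer, total_steps=1000)  # lr_scheduler/ (assumed signature)
```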