ColossalAI/colossalai/nn
Latest commit: HELSON d7ea63992b, [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480), 3 years ago
Name          Last commit                                                                     Last updated
layer         [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480)                    3 years ago
loss          [MOE] polish moe_env (#467)                                                     3 years ago
lr_scheduler  Fixed docstring in colossalai (#171)                                            3 years ago
metric        fixed CI dataset directory; fixed import error of 2.5d accuracy (#255)          3 years ago
model
optimizer     [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)    3 years ago
__init__.py