ColossalAI/colossalai/nn/layer/moe
Latest commit: HELSON d7ea63992b — [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480), 3 years ago
__init__.py    [MOE] changed parallelmode to dist process group (#460)         3 years ago
_operation.py  [MOE] polish moe_env (#467)                                     3 years ago
experts.py     [format] polish name format for MOE (#481)                      3 years ago
layers.py      [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480)     3 years ago
utils.py       [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480)     3 years ago
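The commit messages above refer to an "FP32LinearGate" — keeping the MoE routing gate in full precision even when the rest of the model trains in mixed precision (NaiveAMP), since the small gate logits are sensitive to FP16 rounding. The sketch below is a minimal, hypothetical illustration of that idea, not ColossalAI's actual implementation: a gate function (`fp32_gate`, a name chosen here for illustration) that upcasts half-precision activations to FP32 before the gating projection and softmax.

```python
import numpy as np

def fp32_gate(x, gate_weight):
    # Hypothetical sketch of an FP32 gate for mixed-precision MoE:
    # upcast half-precision token activations and gate weights to
    # float32 before the routing projection and softmax.
    logits = x.astype(np.float32) @ gate_weight.astype(np.float32)
    # Numerically stable softmax over the expert dimension.
    logits -= logits.max(axis=-1, keepdims=True)
    probs = np.exp(logits)
    probs /= probs.sum(axis=-1, keepdims=True)
    # Top-1 routing: each token is sent to its highest-probability expert.
    return probs, probs.argmax(axis=-1)

rng = np.random.default_rng(0)
tokens = rng.standard_normal((4, 8)).astype(np.float16)  # 4 tokens, hidden dim 8
w = rng.standard_normal((8, 2)).astype(np.float16)       # gate weights for 2 experts
probs, top1 = fp32_gate(tokens, w)
```

Because the inputs are upcast at the start, `probs` comes out as float32 regardless of the activation dtype, which is the property the commit title points at.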