ColossalAI/colossalai

Latest commit: [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) by HELSON, 3 years ago
Name                  Latest commit                                                                   Age
amp/
builder/
communication/
context/              [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago
engine/               [MOE] polish moe_env (#467)                                                     3 years ago
kernel/
logging/
nn/                   [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago
registry/
testing/              optimized context test time consumption (#446)                                  3 years ago
trainer/
utils/                [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago
zero/                 zero init ctx receives a dp process group (#471)                                3 years ago
__init__.py
constants.py
core.py               [MOE] polish moe_env (#467)                                                     3 years ago
global_variables.py   [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago
initialize.py         [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago