ColossalAI/colossalai
Latest commit: 9473a1b9c8 by puck_WCR, "AMP docstring/markdown update (#160)", 3 years ago
| Name | Last commit | Last updated |
|------|-------------|--------------|
| amp/ | AMP docstring/markdown update (#160) | 3 years ago |
| builder/ | Optimize pipeline schedule (#94) | 3 years ago |
| communication/ | add scatter/gather optim for pipeline (#123) | 3 years ago |
| context/ | Added MoE parallel (#127) | 3 years ago |
| engine/ | pipeline last stage supports multi output (#151) | 3 years ago |
| kernel/ | fixed jit default setting (#154) | 3 years ago |
| logging/ | update default logger (#100) (#101) | 3 years ago |
| nn/ | refactor kernel (#142) | 3 years ago |
| registry/ | Develop/experiments (#59) | 3 years ago |
| trainer/ | Update layer integration documentations (#108) | 3 years ago |
| utils/ | Added MoE parallel (#127) | 3 years ago |
| zero/ | try import deepspeed when using zero (#130) | 3 years ago |
| __init__.py | Develop/experiments (#59) | 3 years ago |
| constants.py | Added MoE parallel (#127) | 3 years ago |
| core.py | Develop/experiments (#59) | 3 years ago |
| global_variables.py | Added MoE parallel (#127) | 3 years ago |
| initialize.py | Added MoE parallel (#127) | 3 years ago |