ColossalAI/colossalai
Latest commit: f3802d6b06 "fixed jit default setting (#154)" by Frank Lee, 3 years ago
Name                 Last commit                                        Last updated
amp                  Optimize pipeline schedule (#94)                   3 years ago
builder              Optimize pipeline schedule (#94)                   3 years ago
communication        add scatter/gather optim for pipeline (#123)       3 years ago
context              Added MoE parallel (#127)                          3 years ago
engine               pipeline last stage supports multi output (#151)   3 years ago
kernel               fixed jit default setting (#154)                   3 years ago
logging              update default logger (#100) (#101)                3 years ago
nn                   refactor kernel (#142)                             3 years ago
registry             Develop/experiments (#59)                          3 years ago
trainer              Update layer integration documentations (#108)     3 years ago
utils                Added MoE parallel (#127)                          3 years ago
zero                 try import deepspeed when using zero (#130)        3 years ago
__init__.py          Develop/experiments (#59)                          3 years ago
constants.py         Added MoE parallel (#127)                          3 years ago
core.py              Develop/experiments (#59)                          3 years ago
global_variables.py  Added MoE parallel (#127)                          3 years ago
initialize.py        Added MoE parallel (#127)                          3 years ago