ColossalAI/colossalai

Latest commit: 4a3d3446b0 by BoxiangW, "Update layer integration documentations (#108)", 3 years ago
Name                   Last commit                                      Age
amp                    Optimize pipeline schedule (#94)                 3 years ago
builder                Optimize pipeline schedule (#94)                 3 years ago
communication          add scatter/gather optim for pipeline (#123)     3 years ago
context                Added MoE parallel (#127)                        3 years ago
engine                 Added MoE parallel (#127)                        3 years ago
kernel                 add colossalai kernel module (#55)               3 years ago
logging                update default logger (#100) (#101)              3 years ago
nn                     Update layer integration documentations (#108)   3 years ago
registry               Develop/experiments (#59)                        3 years ago
trainer                Update layer integration documentations (#108)   3 years ago
utils                  Added MoE parallel (#127)                        3 years ago
zero                   try import deepspeed when using zero (#130)      3 years ago
__init__.py            Develop/experiments (#59)                        3 years ago
constants.py           Added MoE parallel (#127)                        3 years ago
core.py                Develop/experiments (#59)                        3 years ago
global_variables.py    Added MoE parallel (#127)                        3 years ago
initialize.py          Added MoE parallel (#127)                        3 years ago