ColossalAI/colossalai

Latest commit: adapted for sequence parallel (#163) by Frank Lee (e2089c5c15), 3 years ago
Name                 Last commit                                       Age
amp                  adapted for sequence parallel (#163)              3 years ago
builder              Optimize pipeline schedule (#94)                  3 years ago
communication        add scatter/gather optim for pipeline (#123)      3 years ago
context              adapted for sequence parallel (#163)              3 years ago
engine               adapted for sequence parallel (#163)              3 years ago
kernel               adapted for sequence parallel (#163)              3 years ago
logging              update default logger (#100) (#101)               3 years ago
nn                   adapted for sequence parallel (#163)              3 years ago
registry             Develop/experiments (#59)                         3 years ago
trainer              Update layer integration documentations (#108)    3 years ago
utils                adapted for sequence parallel (#163)              3 years ago
zero                 try import deepspeed when using zero (#130)       3 years ago
__init__.py          Develop/experiments (#59)                         3 years ago
constants.py         Added MoE parallel (#127)                         3 years ago
core.py              Develop/experiments (#59)                         3 years ago
global_variables.py  Added MoE parallel (#127)                         3 years ago
initialize.py        adapted for sequence parallel (#163)              3 years ago
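The listing above is the top level of the colossalai package; initialize.py is its entry point, which wires the other subpackages (amp, zero, engine, context, communication) around a user-supplied model. Below is a minimal sketch of how that entry point was typically used in this era of the project, assuming the launch_from_torch / initialize API of the time; the toy model, synthetic data, and empty config are placeholders, not code taken from this repository.

```python
# Sketch under stated assumptions: colossalai.initialize() wraps the model,
# optimizer, and criterion into an Engine and applies whatever amp/zero/parallel
# settings the config provides (none here, since the config is empty).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

import colossalai

# Placeholder model and synthetic data, used only for illustration.
model = nn.Linear(32, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
dataset = TensorDataset(torch.randn(256, 32), torch.randint(0, 10, (256,)))
train_dataloader = DataLoader(dataset, batch_size=16)

# Set up the distributed context from the torchrun-provided environment
# variables; a real config would carry parallelism / amp / zero settings.
colossalai.launch_from_torch(config={})

# Build the Engine; the dataloaders and LR scheduler come back possibly wrapped.
engine, train_dataloader, _, _ = colossalai.initialize(
    model=model,
    optimizer=optimizer,
    criterion=criterion,
    train_dataloader=train_dataloader,
)

engine.train()
for inputs, labels in train_dataloader:
    inputs, labels = inputs.cuda(), labels.cuda()  # assumes a CUDA device
    engine.zero_grad()
    outputs = engine(inputs)
    loss = engine.criterion(outputs, labels)
    engine.backward(loss)
    engine.step()
```

Such a script would be started with torchrun (for example, torchrun --nproc_per_node=1 train.py) so that launch_from_torch can read the rank and world-size environment variables.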