ColossalAI/colossalai
Latest commit: 9ef05ed1fc by ver217, "try import deepspeed when using zero (#130)", 3 years ago
Directory            Last commit                                    Age
amp                  Optimize pipeline schedule (#94)               3 years ago
builder              Optimize pipeline schedule (#94)               3 years ago
communication        add scatter/gather optim for pipeline (#123)   3 years ago
context              Added MoE parallel (#127)                      3 years ago
engine               Added MoE parallel (#127)                      3 years ago
kernel
logging              update default logger (#100) (#101)            3 years ago
nn                   Added MoE parallel (#127)                      3 years ago
registry
trainer              fix a bug in timer (#114)                      3 years ago
utils                Added MoE parallel (#127)                      3 years ago
zero                 try import deepspeed when using zero (#130)    3 years ago

File                 Last commit                                    Age
__init__.py
constants.py         Added MoE parallel (#127)                      3 years ago
core.py
global_variables.py  Added MoE parallel (#127)                      3 years ago
initialize.py        Added MoE parallel (#127)                      3 years ago