Making large AI models cheaper, faster and more accessible
Latest commit: 84fd7c1d4d by HELSON, "add moe context, moe utilities and refactor gradient handler (#455)", 3 years ago
Directory contents:

gradient_handler/    last commit: add moe context, moe utilities and refactor gradient handler (#455), 3 years ago
ophooks/             last commit: [zero] Update initialize for ZeRO (#458), 3 years ago
paramhooks/          last commit: Fix/format colossalai/engine/paramhooks/ (#350), 3 years ago
schedule/            last commit: Feature/zero (#279), 3 years ago
__init__.py          last commit: Develop/experiments (#59), 3 years ago
_base_engine.py      last commit: set criterion as optional in colossalai initialize (#336), 3 years ago