ColossalAI/colossalai
Latest commit: 7be397ca9c by ver217 — [log] polish disable_existing_loggers (#519), 3 years ago
Name                  Last commit                                                                     Age
amp                   [hotfix] fix initialize bug with zero (#442)                                    3 years ago
builder               add pytorch hooks (#179)                                                        3 years ago
communication         fix format (#332)                                                               3 years ago
context               [polish] polish singleton and global context (#500)                             3 years ago
engine                [polish] polish singleton and global context (#500)                             3 years ago
kernel                [formart] format fixed for kernel\cuda_native codes (#335)                      3 years ago
logging               [log] polish disable_existing_loggers (#519)                                    3 years ago
nn                    [polish] polish singleton and global context (#500)                             3 years ago
registry              add pytorch hooks (#179)                                                        3 years ago
testing               [test] added rerun on exception for testing (#475)                              3 years ago
trainer               Added profiler communication operations                                         3 years ago
utils                 [zero] fix init device bug in zero init context unittest (#516)                 3 years ago
zero                  [zero] fix init device bug in zero init context unittest (#516)                 3 years ago
__init__.py           Develop/experiments (#59)                                                       3 years ago
constants.py          fix format constants.py (#358)                                                  3 years ago
core.py               [polish] polish singleton and global context (#500)                             3 years ago
global_variables.py   [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)    3 years ago
initialize.py         [polish] polish singleton and global context (#500)                             3 years ago