ColossalAI/colossalai
Latest commit 7d81b5b46e by Jiarui Fang: [logging] polish logger format (#543), 3 years ago
amp                  [hotfix] fix initialize bug with zero (#442)                                   3 years ago
builder
communication        fix format (#332)                                                              3 years ago
context              [polish] polish singleton and global context (#500)                            3 years ago
engine               [zero] adapt for no-leaf module in zero (#535)                                 3 years ago
kernel               [cuda] modify the fused adam, support hybrid of fp16 and fp32 (#497)           3 years ago
logging              [logging] polish logger format (#543)                                          3 years ago
nn                   [zero] added hybrid adam, removed loss scale in adam (#527)                    3 years ago
registry
testing              [test] fixed rerun_on_exception and adapted test cases (#487)                  3 years ago
trainer              Added profiler communication operations                                        3 years ago
utils                [zero] get memory usage of sharded optim v2. (#542)                            3 years ago
zero                 [zero] polish ZeroInitContext (#540)                                           3 years ago
__init__.py
constants.py         fix format constants.py (#358)                                                 3 years ago
core.py              [polish] polish singleton and global context (#500)                            3 years ago
global_variables.py  [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   3 years ago
initialize.py        [polish] polish singleton and global context (#500)                            3 years ago