ColossalAI/colossalai
Latest commit 055fbf5be6 by HELSON, 3 years ago: [zero] adapt zero for unsharded paramters (Optimizer part) (#601)
Name                  Age          Last commit message
amp/                  3 years ago  polish amp docstring (#616)
builder/              3 years ago  html refactor (#555)
communication/        3 years ago  [model checkpoint] updated communication ops for cpu tensors (#590)
context/              3 years ago  [model checkpoint] add gloo groups for cpu tensor communication (#589)
engine/               3 years ago  [refactor] memory utils (#577)
kernel/               3 years ago  [NFC] polish colossalai/kernel/cuda_native/csrc/kernels/cuda_util.cu code stype (#625)
logging/              3 years ago  Refactored docstring to google style
nn/                   3 years ago  [model checkpoint] updated saving/loading for 3d layers (#597)
registry/             3 years ago  Refactored docstring to google style
testing/              3 years ago  [test] fixed rerun_on_exception and adapted test cases (#487)
trainer/              3 years ago  [model checkpoint] updated checkpoint hook (#598)
utils/                3 years ago  [zero] adapt zero for unsharded paramters (Optimizer part) (#601)
zero/                 3 years ago  [zero] adapt zero for unsharded paramters (Optimizer part) (#601)
__init__.py           3 years ago  Develop/experiments (#59)
constants.py          3 years ago  fix format constants.py (#358)
core.py               3 years ago  [polish] polish singleton and global context (#500)
global_variables.py   3 years ago  [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)
initialize.py         3 years ago  Refactored docstring to google style