ColossalAI/colossalai

Latest commit: cfadc9df8e by YuliangLiu0306, "[cli] added distributed launcher command (#791)", 3 years ago
Name                  Last commit                                                                                   Age
amp/                  [hotfix] fix memory leak in zero (#781)                                                       3 years ago
builder/              [NFC] polish colossalai/builder/builder.py code style (#662)                                  3 years ago
cli/                  [cli] added distributed launcher command (#791)                                               3 years ago
communication/        [util] fixed communication API depth with PyTorch 1.9 (#721)                                  3 years ago
context/              [compatibility] used backward-compatible API for global process group (#758)                  3 years ago
engine/               [refactor] moving memtracer to gemini (#801)                                                  3 years ago
gemini/               [refactor] moving memtracer to gemini (#801)                                                  3 years ago
kernel/               [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_adam.cu code style (#667)        3 years ago
logging/              Refactored docstring to google style                                                          3 years ago
nn/                   [TP] change the check assert in split batch 2d (#772)                                         3 years ago
registry/             Refactored docstring to google style                                                          3 years ago
testing/              [test] added a decorator for address already in use error with backward compatibility (#760)  3 years ago
trainer/              [refactor] moving memtracer to gemini (#801)                                                  3 years ago
utils/                [refactor] moving memtracer to gemini (#801)                                                  3 years ago
zero/                 [refactor] moving memtracer to gemini (#801)                                                  3 years ago
__init__.py           Develop/experiments (#59)                                                                     3 years ago
constants.py          fix format constants.py (#358)                                                                3 years ago
core.py               [polish] polish singleton and global context (#500)                                           3 years ago
global_variables.py   [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)                  3 years ago
initialize.py         fix initialize about zero                                                                     3 years ago
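The listing above covers the core ColossalAI package of this era: initialize.py and engine/ provide the training entry points, while cli/ houses the distributed launcher added in #791. As a rough orientation, here is a minimal sketch of how these pieces were typically wired together, assuming the colossalai.launch_from_torch / colossalai.initialize API of this release; the model, dataset, and config path are placeholders, not part of the repository.

```python
import torch
import torch.nn as nn

import colossalai
from colossalai.logging import get_dist_logger


def main():
    # Read the distributed environment set up by the launcher (for example
    # `colossalai run --nproc_per_node 4 train.py`, the CLI command from #791,
    # or plain torchrun) and initialize the global context from a config file.
    colossalai.launch_from_torch(config='./config.py')
    logger = get_dist_logger()

    model = nn.Linear(32, 2)  # placeholder model
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()
    train_loader = torch.utils.data.DataLoader(  # placeholder dataset
        torch.utils.data.TensorDataset(
            torch.randn(64, 32), torch.randint(0, 2, (64,))),
        batch_size=8,
    )

    # initialize() wraps model/optimizer/criterion into an Engine that applies
    # whatever features the config requests (AMP, ZeRO, gradient handlers, ...).
    engine, train_loader, _, _ = colossalai.initialize(
        model, optimizer, criterion, train_loader)

    engine.train()
    for data, label in train_loader:
        engine.zero_grad()
        output = engine(data)
        loss = engine.criterion(output, label)
        engine.backward(loss)
        engine.step()
    logger.info('one pass over the placeholder data done', ranks=[0])


if __name__ == '__main__':
    main()
```

Under this workflow, features backed by the directories above, such as AMP (amp/), ZeRO (zero/), and gradient handlers (engine/), are requested declaratively in the config file, and colossalai.initialize folds them into the returned Engine rather than requiring changes to the training loop itself.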