ColossalAI/colossalai
Latest commit: 22c4b88d56 by HELSON: [zero] refactor ShardedParamV2 for convenience (#742), 3 years ago
amp/                  [bug] fixed grad scaler compatibility with torch 1.8 (#735)  3 years ago
builder/              [NFC] polish colossalai/builder/builder.py code style (#662)  3 years ago
communication/        [util] fixed communication API depth with PyTorch 1.9 (#721)  3 years ago
context/              [utils] support detection of number of processes on current node (#723)  3 years ago
engine/               [refactor] zero directory (#724)  3 years ago
kernel/               [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_adam.cu code style (#667)  3 years ago
logging/              Refactored docstring to google style  3 years ago
nn/                   [compatibility] fixed tensor parallel compatibility with torch 1.9 (#700)  3 years ago
registry/             Refactored docstring to google style  3 years ago
testing/              [test] fixed rerun_on_exception and adapted test cases (#487)  3 years ago
trainer/              [utils] add synchronized cuda memory monitor (#740)  3 years ago
utils/                [utils] add synchronized cuda memory monitor (#740)  3 years ago
zero/                 [zero] refactor ShardedParamV2 for convenience (#742)  3 years ago
__init__.py           Develop/experiments (#59)  3 years ago
constants.py          fix format constants.py (#358)  3 years ago
core.py               [polish] polish singleton and global context (#500)  3 years ago
global_variables.py   [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)  3 years ago
initialize.py         [utils] support detection of number of processes on current node (#723)  3 years ago