ColossalAI/colossalai

Latest commit: [hotfix] fix no optimizer in save/load (#1363) by HELSON (943a96323e), 2 years ago
| Name | Last commit | Last updated |
| --- | --- | --- |
| amp | [doc] update rst and docstring (#1351) | 2 years ago |
| builder | [NFC] polish colossalai/builder/builder.py code style (#1265) | 2 years ago |
| cli | [hotfix] fix some bugs caused by size mismatch. (#1011) | 3 years ago |
| communication | [NFC] polish colossalai/communication/collective.py (#1262) | 2 years ago |
| context | [doc] update rst and docstring (#1351) | 2 years ago |
| engine | [hotfix] fix PipelineSharedModuleGradientHandler (#1314) | 2 years ago |
| fx | [fx] added activation checkpoint codegen support for torch < 1.12 (#1359) | 2 years ago |
| gemini | [doc] update rst and docstring (#1351) | 2 years ago |
| kernel | Recover kernal files | 2 years ago |
| logging | [doc] improved docstring in the logging module (#861) | 3 years ago |
| nn | [doc] update rst and docstring (#1351) | 2 years ago |
| pipeline | [pipeline]add customized policy (#1139) | 2 years ago |
| registry | Remove duplication registry (#1078) | 3 years ago |
| tensor | [hotfix] fix no optimizer in save/load (#1363) | 2 years ago |
| testing | [test] skip tests when not enough GPUs are detected (#1090) | 3 years ago |
| trainer | fix issue #1080 (#1071) | 3 years ago |
| utils | [hotfix] fix no optimizer in save/load (#1363) | 2 years ago |
| zero | fix zero optim backward_by_grad and save/load (#1353) | 2 years ago |
| __init__.py | [NFC] polish colossalai/__init__.py code style (#1285) | 2 years ago |
| constants.py | fix typo in constants (#1027) | 3 years ago |
| core.py | [Tensor] distributed view supports inter-process hybrid parallel (#1169) | 2 years ago |
| global_variables.py | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 3 years ago |
| initialize.py | [hotfix] remove potiential circle import (#1307) | 2 years ago |