ColossalAI/colossalai
Latest commit 56b8863b87 by ver217: [zero] chunk manager allows filtering ex-large params (#1393) (2 years ago)
Name                 Last commit                                                      Last updated
amp                  [doc] update rst and docstring (#1351)                           2 years ago
builder              [NFC] polish colossalai/builder/builder.py code style (#1265)   2 years ago
cli
communication        [NFC] polish colossalai/communication/collective.py (#1262)     2 years ago
context              [doc] update rst and docstring (#1351)                           2 years ago
engine               [hotfix] fix PipelineSharedModuleGradientHandler (#1314)        2 years ago
fx                   [fx] patched torch.max and data movement operator (#1391)       2 years ago
gemini               [zero] chunk manager allows filtering ex-large params (#1393)   2 years ago
kernel               Recover kernal files                                             2 years ago
logging
nn                   [hotfix] adapt ProcessGroup and Optimizer to ColoTensor (#1388)  2 years ago
pipeline
registry
tensor               [hotfix] adapt ProcessGroup and Optimizer to ColoTensor (#1388)  2 years ago
testing
trainer
utils                [hotfix] fix a running error in test_colo_checkpoint.py (#1387)  2 years ago
zero                 [zero] zero optim state_dict takes only_rank_0 (#1384)           2 years ago
__init__.py          [NFC] polish colossalai/__init__.py code style (#1285)           2 years ago
constants.py
core.py
global_variables.py
initialize.py        [hotfix] remove potiential circle import (#1307)                 2 years ago
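For orientation, below is a minimal sketch of how the entry points defined in initialize.py and the logging package were typically used around this revision (the 0.1.x era). The empty config dict, the toy model, optimizer, and dataloader are placeholder assumptions, not part of the repository, and exact signatures may differ between versions.

```python
# Minimal single-node training sketch against the assumed 0.1.x-era API;
# the entry points come from initialize.py and logging/ listed above.
# Run under torchrun with at least one GPU, e.g.:
#   torchrun --nproc_per_node=1 train_sketch.py
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

import colossalai
from colossalai.logging import get_dist_logger


def main():
    # launch_from_torch reads rank/world size from the torchrun environment;
    # the config dict is where features such as fp16 (amp/) or zero/ would be enabled.
    colossalai.launch_from_torch(config={})
    logger = get_dist_logger()

    # Placeholder model, optimizer, loss, and synthetic data.
    model = nn.Linear(32, 2)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()
    train_loader = DataLoader(
        TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,))),
        batch_size=8,
    )

    # initialize() wraps model/optimizer/criterion into an Engine that applies
    # whatever the config requested (AMP, ZeRO, gradient handlers from engine/, ...).
    engine, train_loader, _, _ = colossalai.initialize(
        model, optimizer, criterion, train_dataloader=train_loader
    )

    # One pass over the data using the Engine's training interface.
    engine.train()
    for data, label in train_loader:
        data, label = data.cuda(), label.cuda()
        engine.zero_grad()
        loss = engine.criterion(engine(data), label)
        engine.backward(loss)
        engine.step()
    logger.info("finished one epoch", ranks=[0])


if __name__ == "__main__":
    main()
```

The same Engine object is what the trainer/ package drives when a higher-level training loop is used instead of the manual loop above.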