ColossalAI/colossalai

Latest commit e761ad2cd7 by Jiarui Fang: Revert "[zero] add ZeroTensorShardStrategy (#793)" (#806), 3 years ago
Name | Last commit | Age
amp | [hotfix] fix memory leak in zero (#781) | 3 years ago
builder | [NFC] polish colossalai/builder/builder.py code style (#662) | 3 years ago
cli | [cli] add missing requirement (#805) | 3 years ago
communication | [util] fixed communication API depth with PyTorch 1.9 (#721) | 3 years ago
context | [compatibility] used backward-compatible API for global process group (#758) | 3 years ago
engine | [refactor] moving grad acc logic to engine (#804) | 3 years ago
gemini | [refactor] moving grad acc logic to engine (#804) | 3 years ago
kernel | Revert "[zero] add ZeroTensorShardStrategy (#793)" (#806) | 3 years ago
logging | Refactored docstring to google style | 3 years ago
nn | [TP] change the check assert in split batch 2d (#772) | 3 years ago
registry | Refactored docstring to google style | 3 years ago
testing | [test] added a decorator for address already in use error with backward compatibility (#760) | 3 years ago
trainer | [refactor] moving memtracer to gemini (#801) | 3 years ago
utils | [refactor] moving grad acc logic to engine (#804) | 3 years ago
zero | Revert "[zero] add ZeroTensorShardStrategy (#793)" (#806) | 3 years ago
__init__.py | |
constants.py | |
core.py | |
global_variables.py | |
initialize.py | [refactor] moving grad acc logic to engine (#804) | 3 years ago