ColossalAI/colossalai

Latest commit: e8a9bebc87 by Super Daniel, "[autoparallel] refactor and add rotorc. (#1789)", 2 years ago
Name                 Last commit                                                                    Age
amp                  [doc] update rst and docstring (#1351)                                         2 years ago
auto_parallel        [autoparallel] refactor and add rotorc. (#1789)                                2 years ago
builder              [NFC] polish colossalai/builder/__init__.py code style (#1560)                 2 years ago
cli
communication        [communication] add p2p_v2.py to support communication with List[Any] (#1407)  2 years ago
context              updated tp layers                                                              2 years ago
device               [autoparallel] add numerical test for node strategies (#1760)                  2 years ago
engine               [engin/schedule] use p2p_v2 to recontruct pipeline_schedule (#1408)            2 years ago
fx                   [autoparallel] refactor and add rotorc. (#1789)                                2 years ago
gemini               [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)      2 years ago
kernel               [feat] add flash attention (#1762)                                             2 years ago
logging
nn                   [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)      2 years ago
pipeline             [Pipeline]Adapt to Pipelinable OPT (#1782)                                     2 years ago
registry
tensor               [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)      2 years ago
testing              [unittest] added doc for the pytest wrapper (#1704)                            2 years ago
trainer              [NFC] polish _checkpoint_hook.py code style (#1722)                            2 years ago
utils                [zero] add constant placement policy (#1705)                                   2 years ago
zero                 [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)      2 years ago
__init__.py          upgrade version to 0.1.11rc1 (#1739)                                           2 years ago
constants.py         updated tp layers                                                              2 years ago
core.py
global_variables.py  updated tp layers                                                              2 years ago
initialize.py