ColossalAI/colossalai
Latest commit 0584654c79 by Super Daniel, 2 years ago: [fx] refactor memory utils and extend shard utils. (#1754)
| Name | Last commit | Last updated |
|------|-------------|--------------|
| amp | [doc] update rst and docstring (#1351) | 2 years ago |
| auto_parallel | [autoparallel] refactor the runtime apply pass and add docstring to passes (#1757) | 2 years ago |
| builder | [NFC] polish colossalai/builder/__init__.py code style (#1560) | 2 years ago |
| cli | | |
| communication | [communication] add p2p_v2.py to support communication with List[Any] (#1407) | 2 years ago |
| context | [zero] add constant placement policy (#1705) | 2 years ago |
| device | [tensor]add 1D device mesh (#1492) | 2 years ago |
| engine | [engin/schedule] use p2p_v2 to recontruct pipeline_schedule (#1408) | 2 years ago |
| fx | [fx] refactor memory utils and extend shard utils. (#1754) | 2 years ago |
| gemini | [zero] add chunk init function for users (#1729) | 2 years ago |
| kernel | [hotfix] fix CPUAdam kernel nullptr (#1410) | 2 years ago |
| logging | | |
| nn | [NFC] polish colossalai/nn/metric/_utils.py code style (#1727) | 2 years ago |
| pipeline | fix file name (#1759) | 2 years ago |
| registry | | |
| tensor | [autoparallel] shard param and buffer as expected (#1753) | 2 years ago |
| testing | [unittest] added doc for the pytest wrapper (#1704) | 2 years ago |
| trainer | [NFC] polish _checkpoint_hook.py code style (#1722) | 2 years ago |
| utils | [zero] add constant placement policy (#1705) | 2 years ago |
| zero | [NFC] polish colossalai/zero/sharded_param/__init__.py code style (#1717) | 2 years ago |
| __init__.py | upgrade version to 0.1.11rc1 (#1739) | 2 years ago |
| constants.py | | |
| core.py | [Tensor] distributed view supports inter-process hybrid parallel (#1169) | 2 years ago |
| global_variables.py | | |
| initialize.py | [hotfix] remove potiential circle import (#1307) | 2 years ago |