ColossalAI/colossalai

Latest commit: eee84908d4 by Frank Lee, "[autoparallel] handled illegal sharding strategy (#1728)", 2 years ago
| Name | Last commit | Committed |
|------|-------------|-----------|
| amp | | |
| auto_parallel | [autoparallel] handled illegal sharding strategy (#1728) | 2 years ago |
| builder | | |
| cli | | |
| communication | | |
| context | [zero] add constant placement policy (#1705) | 2 years ago |
| device | | |
| engine | | |
| fx | [autoparallel] runtime_backward_apply (#1720) | 2 years ago |
| gemini | [zero] add chunk init function for users (#1729) | 2 years ago |
| kernel | | |
| logging | | |
| nn | [NFC] polish colossalai/nn/metric/_utils.py code style (#1727) | 2 years ago |
| pipeline | [fx/meta/rpc] move _meta_registration.py to fx folder / register fx functions with compatibility checks / remove color debug (#1710) | 2 years ago |
| registry | | |
| tensor | [autoparallel] handled illegal sharding strategy (#1728) | 2 years ago |
| testing | [unittest] added doc for the pytest wrapper (#1704) | 2 years ago |
| trainer | [NFC] polish _checkpoint_hook.py code style (#1722) | 2 years ago |
| utils | [zero] add constant placement policy (#1705) | 2 years ago |
| zero | [NFC] polish colossalai/zero/sharded_param/__init__.py code style (#1717) | 2 years ago |
| __init__.py | upgrade version to 0.1.11rc1 (#1739) | 2 years ago |
| constants.py | | |
| core.py | | |
| global_variables.py | | |
| initialize.py | | |
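The listing above is the top-level `colossalai` package at the 0.1.11rc1 snapshot (per `__init__.py`): `initialize.py` provides the launch/initialize entry points, `context` manages the distributed context, `engine` wraps training, and `amp` supplies mixed precision. For orientation, here is a minimal sketch of the 0.1.x-era workflow built on those entry points; the empty config, toy model, and optimizer below are illustrative assumptions, not taken from this listing.

```python
# Sketch of the 0.1.x ColossalAI workflow, assuming the launch_from_torch and
# initialize entry points from initialize.py. Toy model/config are placeholders.
import torch
import torch.nn as nn
import colossalai

# Reads rank/world size from the env vars set by torchrun and builds the
# distributed context handled by the `context` package.
colossalai.launch_from_torch(config=dict())

model = nn.Linear(16, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
criterion = nn.CrossEntropyLoss()

# initialize() wraps model/optimizer/criterion into an Engine (see the
# `engine` package); features such as AMP would be enabled via the config.
engine, *_ = colossalai.initialize(model, optimizer, criterion)

# One illustrative training step (assumes a CUDA device is available).
engine.train()
data = torch.randn(8, 16).cuda()
label = torch.randint(0, 4, (8,)).cuda()
engine.zero_grad()
loss = engine.criterion(engine(data), label)
engine.backward(loss)
engine.step()
```

Run under a distributed launcher, e.g. `torchrun --nproc_per_node=1 sketch.py`, so that `launch_from_torch` finds the rank and world-size environment variables it expects.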