ColossalAI/colossalai/auto_parallel/tensor_shard
Latest commit: 554aa9592e by Hongxin Liu — [legacy] move communication and nn to legacy and refactor logger (#4671) — 1 year ago
node_handler          [legacy] move communication and nn to legacy and refactor logger (#4671)   1 year ago
solver                [NFC] fix typo with colossalai/auto_parallel/tensor_shard (#3742)          2 years ago
utils                 [test] fixed tests failed due to dtensor change (#4082)                    1 year ago
__init__.py
constants.py
initialize.py         [autoparallel]integrate auto parallel feature with new tracer (#3408)      2 years ago
options.py            [autoparallel] add shard option (#2696)                                    2 years ago
sharding_strategy.py