ColossalAI/colossalai/auto_parallel/tensor_shard
Latest commit: fea3cb661c — [autoparallel] support addmm in tracer and solver (#1961) by YuliangLiu0306, 2 years ago
| Name | Last commit | Age |
| --- | --- | --- |
| deprecated | [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/operator_handler.py code style (#1845) | 2 years ago |
| node_handler | [autoparallel] support addmm in tracer and solver (#1961) | 2 years ago |
| solver | [autoparallel] fix linear logical convert issue (#1857) | 2 years ago |
| utils | [autoparallel]add essential CommActions for broadcast oprands (#1793) | 2 years ago |
| __init__.py | | |
| constants.py | | |
| sharding_strategy.py | [fx] Add linear metainfo class for auto parallel (#1783) | 2 years ago |