ColossalAI/colossalai/auto_parallel/tensor_shard

Latest commit: 35e6b9ec82 by YuliangLiu0306, [autoparallel] adapt handlers with attention block (#1990), 2022-11-21 10:44:11 +08:00
deprecated/           [NFC] polish colossalai/auto_parallel/tensor_shard/deprecated/op_handler/operator_handler.py code style (#1845)   2022-11-09 12:08:47 +08:00
node_handler/         [autoparallel] adapt handlers with attention block (#1990)                                                        2022-11-21 10:44:11 +08:00
solver/               [autoparallel] support distributed dataloader option (#1906)                                                      2022-11-17 20:11:53 +08:00
utils/                [autoparallel] add essential CommActions for broadcast operands (#1793)                                           2022-11-04 18:36:42 +08:00
__init__.py           [autoparallel] init new folder structure (#1696)                                                                  2022-10-13 14:18:55 +08:00
constants.py          [autoparallel] added binary elementwise node handler (#1758)                                                      2022-10-25 14:32:01 +08:00
sharding_strategy.py  [autoparallel] support distributed dataloader option (#1906)                                                      2022-11-17 20:11:53 +08:00