ColossalAI/colossalai/auto_parallel/tensor_shard/node_handler
Latest commit: 1dc003c169 by YuliangLiu0306
[autoparallel] distinguish different parallel strategies (#2699)
2023-02-15 22:28:28 +08:00
| Name | Last commit | Date |
|------|-------------|------|
| strategy/ | [autoparallel] distinguish different parallel strategies (#2699) | 2023-02-15 22:28:28 +08:00 |
| __init__.py | [autoparallel] add shard option (#2696) | 2023-02-15 13:48:28 +08:00 |
| addmm_handler.py | [autoparallel] support addmm in tracer and solver (#1961) | 2022-11-16 14:59:18 +08:00 |
| batch_norm_handler.py | [autoparallel] use metainfo in handler (#2149) | 2022-12-20 10:31:22 +08:00 |
| binary_elementwise_handler.py | [autoparallel] update binary elementwise handler (#2451) | 2023-01-12 09:35:10 +08:00 |
| bmm_handler.py | [autoparallel] add essential CommActions for broadcast oprands (#1793) | 2022-11-04 18:36:42 +08:00 |
| conv_handler.py | [autoparallel] use metainfo in handler (#2149) | 2022-12-20 10:31:22 +08:00 |
| default_reshape_handler.py | [autoparallel] refactor handlers which reshape input tensors (#2615) | 2023-02-08 15:02:49 +08:00 |
| embedding_handler.py | [autoparallel] add embedding handler (#2089) | 2022-12-07 09:41:46 +08:00 |
| getattr_handler.py | [autoparallel] fix spelling error (#2270) | 2023-01-03 16:13:00 +08:00 |
| getitem_handler.py | [NFC] polish colossalai/auto_parallel/tensor_shard/node_handler/getitem_handler.py code style (#2289) | 2023-01-04 15:09:57 +08:00 |
| layer_norm_handler.py | [autoparallel] Patch meta information of `torch.nn.LayerNorm` (#2647) | 2023-02-10 14:29:24 +08:00 |
| linear_handler.py | [autoparallel] distinguish different parallel strategies (#2699) | 2023-02-15 22:28:28 +08:00 |
| matmul_handler.py | [autoparallel] Patch meta information of `torch.matmul` (#2584) | 2023-02-08 11:05:31 +08:00 |
| node_handler.py | [autoparallel] add shard option (#2696) | 2023-02-15 13:48:28 +08:00 |
| normal_pooling_handler.py | [autoparallel] use metainfo in handler (#2149) | 2022-12-20 10:31:22 +08:00 |
| output_handler.py | [autoparallel] fix spelling error (#2270) | 2023-01-03 16:13:00 +08:00 |
| permute_handler.py | [autoparallel] refactor handlers which reshape input tensors (#2615) | 2023-02-08 15:02:49 +08:00 |
| placeholder_handler.py | [autoparallel] fix spelling error (#2270) | 2023-01-03 16:13:00 +08:00 |
| registry.py | [autoparallel] added binary elementwise node handler (#1758) | 2022-10-25 14:32:01 +08:00 |
| softmax_handler.py | [autoparallel] implement softmax handler (#2132) | 2022-12-14 16:09:53 +08:00 |
| split_handler.py | [autoparallel] refactor handlers which reshape input tensors (#2615) | 2023-02-08 15:02:49 +08:00 |
| sum_handler.py | [autoparallel] add sum handler (#2101) | 2022-12-08 17:02:54 +08:00 |
| tensor_constructor_handler.py | [autoparallel] add tensor constructor handler (#2082) | 2022-12-06 10:20:10 +08:00 |
| transpose_handler.py | [autoparallel] refactor handlers which reshape input tensors (#2615) | 2023-02-08 15:02:49 +08:00 |
| unary_elementwise_handler.py | [autockpt] make it work. (#2257) | 2023-01-02 23:37:45 +08:00 |
| view_handler.py | [autoparallel] refactor handlers which reshape input tensors (#2615) | 2023-02-08 15:02:49 +08:00 |
| where_handler.py | [autoparallel] adapt handlers with attention block (#1990) | 2022-11-21 10:44:11 +08:00 |