ColossalAI/colossalai/auto_parallel/meta_profiler/meta_registry
Latest commit: a2b43e393d by Boyuan Yao (2023-02-17 10:39:48 +08:00)
[autoparallel] Patch meta information of `torch.nn.Embedding` (#2760)
* [autoparallel] embedding metainfo
* [autoparallel] fix function name in test_activation_metainfo
* [autoparallel] undo changes in activation metainfo and related tests
File                        Last commit message                                                                                       Date
__init__.py                 [autoparallel] Patch meta information of `torch.nn.Embedding` (#2760)                                    2023-02-17 10:39:48 +08:00
activation.py               [autoparallel] Patch meta information of `torch.nn.functional.softmax` and `torch.nn.Softmax` (#2674)   2023-02-13 16:09:22 +08:00
binary_elementwise_ops.py   [autoparallel] bypass MetaInfo when unavailable and modify BCAST_FUNC_OP metainfo (#2293)                2023-01-03 20:28:01 +08:00
conv.py                     [autoparallel] Attach input, buffer and output tensor to MetaInfo class (#2162)                          2022-12-28 13:37:40 +08:00
embedding.py                [autoparallel] Patch meta information of `torch.nn.Embedding` (#2760)                                    2023-02-17 10:39:48 +08:00
linear.py                   [autoparallel] Patch meta information of `torch.matmul` (#2584)                                          2023-02-08 11:05:31 +08:00
norm.py                     [autoparallel] Patch meta information of `torch.nn.LayerNorm` (#2647)                                    2023-02-10 14:29:24 +08:00
pooling.py                  [autoparallel] patch torch.flatten metainfo for autoparallel (#2247)                                     2023-01-02 15:51:03 +08:00
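Each module in this directory registers meta information (compute and memory cost estimates) for a specific torch operator, so the auto-parallel solver can profile candidate strategies without executing real kernels. The sketch below is illustrative only and does not use ColossalAI's own meta_registry API: it relies on plain PyTorch meta tensors to show the kind of shape and memory estimate these files produce, taking `torch.nn.Embedding` as the example. The helper name `estimate_embedding_memory` and its return layout are hypothetical.

```python
# Illustrative sketch only: estimate the parameter and activation footprint of an
# embedding layer on PyTorch's "meta" device, where shapes and dtypes are tracked
# but no real memory is allocated. This is NOT ColossalAI's meta_register API.
import torch
import torch.nn as nn


def estimate_embedding_memory(num_embeddings: int, embedding_dim: int,
                              batch_size: int, seq_len: int) -> dict:
    # Build the module and a fake index tensor directly on the meta device.
    module = nn.Embedding(num_embeddings, embedding_dim, device="meta")
    indices = torch.empty(batch_size, seq_len, dtype=torch.long, device="meta")

    # Forward pass runs through meta kernels, producing only output metadata.
    output = module(indices)

    elem_size = torch.finfo(module.weight.dtype).bits // 8
    return {
        "param_bytes": module.weight.numel() * elem_size,
        "output_bytes": output.numel() * elem_size,
        "output_shape": tuple(output.shape),
    }


if __name__ == "__main__":
    # Example: a BERT-base sized vocabulary and hidden dimension (illustrative numbers).
    print(estimate_embedding_memory(30522, 768, batch_size=8, seq_len=128))
```

The real registry entries go further than this sketch: per the commit messages above, they also attach input, buffer, and output tensors to a MetaInfo object and estimate forward/backward compute cost, not just memory.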