ColossalAI/colossalai/auto_parallel
Boyuan Yao a2b43e393d
[autoparallel] Patch meta information of `torch.nn.Embedding` (#2760)
* [autoparallel] embedding metainfo

* [autoparallel] fix function name in test_activation_metainfo

* [autoparallel] undo changes in activation metainfo and related tests
2023-02-17 10:39:48 +08:00
checkpoint [hotfix] pass a parameter. (#2288) 2023-01-03 18:05:06 +08:00
meta_profiler [autoparallel] Patch meta information of `torch.nn.Embedding` (#2760) 2023-02-17 10:39:48 +08:00
passes [autoparallel] fix parameters sharding bug (#2716) 2023-02-15 12:25:50 +08:00
pipeline_shard [autoparallel] init new folder structure (#1696) 2022-10-13 14:18:55 +08:00
tensor_shard [autoparallel] distinguish different parallel strategies (#2699) 2023-02-15 22:28:28 +08:00
__init__.py [autoparallel] standardize the code structure (#1469) 2022-08-19 15:51:54 +08:00