ColossalAI/colossalai/auto_parallel
Latest commit 37df666f38 (YuliangLiu0306, 2023-02-08 15:02:49 +08:00):
    [autoparallel] refactor handlers which reshape input tensors (#2615)
    * polish
Name              Last commit                                                             Last commit date
checkpoint        [hotfix] pass a parameter. (#2288)                                      2023-01-03 18:05:06 +08:00
meta_profiler     [autoparallel] Patch meta information of `torch.matmul` (#2584)         2023-02-08 11:05:31 +08:00
passes            add overlap option (#2613)                                              2023-02-08 15:02:31 +08:00
pipeline_shard    [autoparallel] init new folder structure (#1696)                        2022-10-13 14:18:55 +08:00
tensor_shard      [autoparallel] refactor handlers which reshape input tensors (#2615)    2023-02-08 15:02:49 +08:00
__init__.py       [autoparallel] standardize the code structure (#1469)                   2022-08-19 15:51:54 +08:00