ColossalAI/colossalai/auto_parallel
Latest commit: 5b24987fa7 by YuliangLiu0306, "[autoparallel] fix parameters sharding bug (#2716)", 2023-02-15 12:25:50 +08:00
Name            Last commit message                                                                                       Last commit date
checkpoint      [hotfix] pass a parameter. (#2288)                                                                        2023-01-03 18:05:06 +08:00
meta_profiler   [autoparallel] Patch meta information of `torch.nn.functional.softmax` and `torch.nn.Softmax` (#2674)     2023-02-13 16:09:22 +08:00
passes          [autoparallel] fix parameters sharding bug (#2716)                                                        2023-02-15 12:25:50 +08:00
pipeline_shard  [autoparallel] init new folder structure (#1696)                                                          2022-10-13 14:18:55 +08:00
tensor_shard    [autoparallel] remove deprecated codes (#2664)                                                            2023-02-15 09:54:32 +08:00
__init__.py     [autoparallel] standardize the code structure (#1469)                                                     2022-08-19 15:51:54 +08:00