ColossalAI/colossalai/auto_parallel

Latest commit: a2b43e393d by Boyuan Yao, [autoparallel] Patch meta information of `torch.nn.Embedding` (#2760), 2 years ago
| Name | Last commit | Last updated |
|------|-------------|--------------|
| checkpoint | [hotfix] pass a parameter. (#2288) | 2 years ago |
| meta_profiler | [autoparallel] Patch meta information of `torch.nn.Embedding` (#2760) | 2 years ago |
| passes | [autoparallel] fix parameters sharding bug (#2716) | 2 years ago |
| pipeline_shard | | |
| tensor_shard | [autoparallel] distinguish different parallel strategies (#2699) | 2 years ago |
| __init__.py | | |