ColossalAI/colossalai/auto_parallel
Latest commit: aa0f6686f9 by YuliangLiu0306, [autoparallel] accelerate gpt2 training (#2495), 2 years ago
checkpoint [hotfix] pass a parameter. (#2288) 2 years ago
meta_profiler [autoparallel] bypass MetaInfo when unavailable and modify BCAST_FUNC_OP metainfo (#2293) 2 years ago
passes [autoparallel] accelerate gpt2 training (#2495) 2 years ago
pipeline_shard [autoparallel] init new folder structure (#1696) 2 years ago
tensor_shard [autoparallel] accelerate gpt2 training (#2495) 2 years ago
__init__.py [autoparallel] standardize the code structure (#1469) 2 years ago