ColossalAI/colossalai/booster/plugin
Baizhou Zhang f911d5b09d
[doc] Add user document for Shardformer (#4702)
* create shardformer doc files

* add docstring for seq-parallel

* update ShardConfig docstring

* add links to llama example

* add outdated message

* finish introduction & supporting information

* finish 'how shardformer works'

* finish shardformer.md English doc

* fix doctest fail

* add Chinese document
2023-09-15 10:56:39 +08:00
__init__.py [plugin] add 3d parallel plugin (#4295) 2023-08-15 23:25:14 +08:00
dp_plugin_base.py [booster] update prepare dataloader method for plugin (#3706) 2023-05-08 15:44:03 +08:00
gemini_plugin.py Merge branch 'main' into feature/shardformer 2023-09-04 23:43:13 +08:00
hybrid_parallel_plugin.py [doc] Add user document for Shardformer (#4702) 2023-09-15 10:56:39 +08:00
low_level_zero_plugin.py [zero] hotfix master param sync (#4618) 2023-09-05 15:04:02 +08:00
plugin_base.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
pp_plugin_base.py [pipeline] set optimizer to optional in execute_pipeline (#4630) 2023-09-07 10:42:59 +08:00
torch_ddp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
torch_fsdp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00