mirror of https://github.com/hpcaitech/ColossalAI
f911d5b09d
* create shardformer doc files
* add docstring for seq-parallel
* update ShardConfig docstring
* add links to llama example
* add outdated message
* finish introduction & supporting information
* finish 'how shardformer works'
* finish shardformer.md English doc
* fix doctest fail
* add Chinese document
__init__.py
dp_plugin_base.py
gemini_plugin.py
hybrid_parallel_plugin.py
low_level_zero_plugin.py
plugin_base.py
pp_plugin_base.py
torch_ddp_plugin.py
torch_fsdp_plugin.py
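The files listed above implement ColossalAI's booster plugins (Gemini, hybrid parallel, low-level ZeRO, Torch DDP/FSDP). A minimal sketch of how one such plugin is typically wired into the Booster API is shown below; the exact import paths, the `launch_from_torch` call, and the `boost()` signature are assumptions based on ColossalAI's public documentation and may differ between releases.

```python
# Illustrative sketch only: wiring a booster plugin (GeminiPlugin here) into
# the Booster API. Import paths and call signatures are assumptions and may
# vary across ColossalAI versions.
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin

# Initialize the distributed environment (a torchrun-style launch is assumed).
colossalai.launch_from_torch(config={})

model = nn.Linear(1024, 1024)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

# The chosen plugin decides how the model and optimizer are sharded/wrapped.
plugin = GeminiPlugin()
booster = Booster(plugin=plugin)

# boost() returns plugin-wrapped versions of the training objects.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)
```

Swapping `GeminiPlugin` for `TorchDDPPlugin`, `TorchFSDPPlugin`, `LowLevelZeroPlugin`, or `HybridParallelPlugin` (corresponding to the files above) changes the parallelization strategy without altering the rest of the training loop.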