mirror of https://github.com/hpcaitech/ColossalAI
Latest commit (squashed):
* create shardformer doc files
* add docstring for seq-parallel
* update ShardConfig docstring
* add links to llama example
* add outdated message
* finish introduction & supporting information
* finish 'how shardformer works'
* finish shardformer.md English doc
* fix doctest fail
* add Chinese document
_C
_analyzer
amp
auto_parallel
autochunk
booster
checkpoint_io
cli
cluster
context
device
fx
inference
interface
kernel
lazy
legacy
logging
nn
pipeline
shardformer
tensor
testing
utils
zero
__init__.py
constants.py
core.py
global_variables.py
initialize.py