mirror of https://github.com/hpcaitech/ColossalAI
0e484e6201
* fix typo colossalai/autochunk auto_parallel amp
* fix typo colossalai/auto_parallel nn utils etc.
* fix typo colossalai/auto_parallel autochunk fx/passes etc.
* fix typo docs/
* change placememt_policy to placement_policy in docs/ and examples/
* fix typo colossalai/ applications/
* fix typo colossalai/cli fx kernel
* fix typo colossalai/nn
* revert change warmuped
* fix typo colossalai/pipeline tensor nn
File listing:

_ops
layer
loss
lr_scheduler
metric
optimizer
parallel
__init__.py
init.py