.. Colossal-AI documentation master file, created by
   sphinx-quickstart on Mon Oct 11 17:05:05 2021.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Colossal-AI documentation
======================================

.. toctree::
   :maxdepth: 1
   :caption: GETTING STARTED

   installation.md
   run_demo.md


.. toctree::
   :maxdepth: 1
   :caption: CUSTOMIZE YOUR TRAINING

   parallelization.md
   model.md
   trainer_engine.md
   amp.md
   zero.md
   add_your_parallel.md
   config.md


.. toctree::
   :maxdepth: 2
   :caption: API REFERENCE

   colossalai/colossalai


Indices and tables
==================

* :ref:`genindex`