colossalai.context
==================
*This module receives the user's configuration and passes it to every device in order to
initialize and construct the parallel training context.*
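A minimal usage sketch (run with ``torchrun``) is given below; it assumes the legacy
``colossalai.launch_from_torch`` entry point and the global context ``gpc`` exposed by
``colossalai.core``, so adapt the imports and configuration keys to your installed version:

.. code-block:: python

   import colossalai
   from colossalai.core import global_context as gpc
   from colossalai.context import ParallelMode

   # The config dict is forwarded to colossalai.context, which builds the
   # process groups for each parallel mode on every device.
   colossalai.launch_from_torch(
       config=dict(parallel=dict(pipeline=1, tensor=dict(size=1, mode=None))))

   # Query the constructed parallel context on any rank.
   world_size = gpc.get_world_size(ParallelMode.GLOBAL)
   global_rank = gpc.get_global_rank()
   print(f'rank {global_rank} of {world_size} initialized')
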
.. toctree::
   :maxdepth: 2

   colossalai.context.process_group_initializer
   colossalai.context.random


.. toctree::
   :maxdepth: 2

   colossalai.context.config
   colossalai.context.parallel_context
   colossalai.context.moe_context
   colossalai.context.parallel_mode


.. automodule:: colossalai.context
   :members: