colossalai.context.process\_group\_initializer
==============================================

.. automodule:: colossalai.context.process_group_initializer
   :members:

.. toctree::
   :maxdepth: 2

   colossalai.context.process_group_initializer.initializer_1d
   colossalai.context.process_group_initializer.initializer_2d
   colossalai.context.process_group_initializer.initializer_2p5d
   colossalai.context.process_group_initializer.initializer_3d
   colossalai.context.process_group_initializer.initializer_data
   colossalai.context.process_group_initializer.initializer_model
   colossalai.context.process_group_initializer.initializer_moe
   colossalai.context.process_group_initializer.initializer_pipeline
   colossalai.context.process_group_initializer.initializer_sequence
   colossalai.context.process_group_initializer.initializer_tensor
   colossalai.context.process_group_initializer.process_group_initializer