colossalai.nn.layer.parallel\_2p5d
==================================

.. automodule:: colossalai.nn.layer.parallel_2p5d
   :members:

.. toctree::
   :maxdepth: 2

   colossalai.nn.layer.parallel_2p5d.layers