colossalai.nn.optimizer
=======================

.. automodule:: colossalai.nn.optimizer
   :members:


.. toctree::
   :maxdepth: 2

   colossalai.nn.optimizer.colossalai_optimizer
   colossalai.nn.optimizer.fused_adam
   colossalai.nn.optimizer.fused_lamb
   colossalai.nn.optimizer.fused_sgd
   colossalai.nn.optimizer.lamb
   colossalai.nn.optimizer.lars
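
As a quick orientation, the sketch below shows how one of the optimizers listed
above might be dropped into an ordinary PyTorch training step. It is a minimal
sketch, not authoritative usage: the class name ``FusedAdam`` and its
``torch.optim``-style constructor are assumptions inferred from the
``fused_adam`` module in the toctree; consult the generated member pages for
the exact exported names and signatures.

.. code-block:: python

   import torch
   import torch.nn as nn

   # Assumed export: the ``fused_adam`` module above is taken to expose a
   # ``FusedAdam`` class following the standard torch.optim.Optimizer
   # interface (params iterable plus keyword hyperparameters).
   from colossalai.nn.optimizer import FusedAdam

   # Fused optimizers typically run fused CUDA kernels, so the parameters
   # are placed on the GPU.
   model = nn.Linear(16, 4).cuda()
   optimizer = FusedAdam(model.parameters(), lr=1e-3, weight_decay=0.01)

   inputs = torch.randn(8, 16, device="cuda")
   targets = torch.randn(8, 4, device="cuda")

   loss = nn.functional.mse_loss(model(inputs), targets)
   optimizer.zero_grad()  # clear gradients from the previous step
   loss.backward()        # compute gradients
   optimizer.step()       # apply the (assumed) fused Adam update

The other modules (``fused_lamb``, ``fused_sgd``, ``lamb``, ``lars``) would be
used the same way, swapping in the corresponding optimizer class.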