colossalai.utils
================
.. automodule:: colossalai.utils
   :members:

.. toctree::
   :maxdepth: 2

   colossalai.utils.data_sampler
   colossalai.utils.gradient_accumulation
   colossalai.utils.multi_tensor_apply

.. toctree::
   :maxdepth: 2

   colossalai.utils.activation_checkpoint
   colossalai.utils.checkpointing
   colossalai.utils.common
   colossalai.utils.cuda
   colossalai.utils.memory
   colossalai.utils.timer