ColossalAI/docs/source/en/features

Latest commit: Guangyao Zhang bdb125f83f, [doc] FP8 training and communication document (#6050), 3 months ago
| File | Last commit | Last updated |
| --- | --- | --- |
| 1D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 1 year ago |
| 2D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 1 year ago |
| 2p5D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 1 year ago |
| 3D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 1 year ago |
| cluster_utils.md | [doc] add booster docstring and fix autodoc (#3789) | 2 years ago |
| distributed_optimizers.md | [doc] Update llama + sp compatibility; fix dist optim table | 5 months ago |
| gradient_accumulation_with_booster.md | [misc] refactor launch API and tensor constructor (#5666) | 7 months ago |
| gradient_clipping_with_booster.md | [misc] refactor launch API and tensor constructor (#5666) | 7 months ago |
| lazy_init.md | [misc] refactor launch API and tensor constructor (#5666) | 7 months ago |
| mixed_precision_training_with_booster.md | [doc] FP8 training and communication document (#6050) | 3 months ago |
| nvme_offload.md | [misc] refactor launch API and tensor constructor (#5666) | 7 months ago |
| pipeline_parallel.md | [hotfix] set return_outputs=False in examples and polish code (#5404) | 8 months ago |
| sequence_parallelism.md | [doc] update sp doc (#6055) | 3 months ago |
| shardformer.md | [doc] Update llama + sp compatibility; fix dist optim table | 5 months ago |
| zero_with_chunk.md | [misc] refactor launch API and tensor constructor (#5666) | 7 months ago |