ColossalAI/docs/source/en/features

Latest commit: `58d8b8a2dd` by Hongxin Liu, 2024-10-18 16:48:52 +08:00
"[misc] fit torch api upgradation and remove legecy import (#6093)"

Squashed commits:

* [amp] fit torch's new api
* [amp] fix api call
* [amp] fix api call
* [misc] fit torch pytree api upgrade
* [misc] remove legacy import
* [misc] fit torch amp api
* [misc] fit torch amp api
| File | Last commit | Date |
| --- | --- | --- |
| 1D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| 2D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| 2p5D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| 3D_tensor_parallel.md | [doc] clean up outdated docs (#4765) | 2023-09-21 11:36:20 +08:00 |
| cluster_utils.md | [doc] add booster docstring and fix autodoc (#3789) | 2023-05-22 10:56:47 +08:00 |
| distributed_optimizers.md | [doc] Update llama + sp compatibility; fix dist optim table | 2024-07-01 17:07:22 +08:00 |
| gradient_accumulation_with_booster.md | [misc] refactor launch API and tensor constructor (#5666) | 2024-04-29 10:40:11 +08:00 |
| gradient_clipping_with_booster.md | [misc] refactor launch API and tensor constructor (#5666) | 2024-04-29 10:40:11 +08:00 |
| lazy_init.md | [misc] refactor launch API and tensor constructor (#5666) | 2024-04-29 10:40:11 +08:00 |
| mixed_precision_training_with_booster.md | [misc] fit torch api upgradation and remove legecy import (#6093) | 2024-10-18 16:48:52 +08:00 |
| nvme_offload.md | [misc] refactor launch API and tensor constructor (#5666) | 2024-04-29 10:40:11 +08:00 |
| pipeline_parallel.md | [hotfix] set return_outputs=False in examples and polish code (#5404) | 2024-03-25 12:31:09 +08:00 |
| sequence_parallelism.md | [doc] update sp doc (#6055) | 2024-09-11 17:25:14 +08:00 |
| shardformer.md | [doc] Update llama + sp compatibility; fix dist optim table | 2024-07-01 17:07:22 +08:00 |
| zero_with_chunk.md | [misc] refactor launch API and tensor constructor (#5666) | 2024-04-29 10:40:11 +08:00 |