mirror of https://github.com/hpcaitech/ColossalAI
Commit b5f9e37c70:

[legacy] remove outdated codes of pipeline (#4692)

* [legacy] remove cli of benchmark and update optim (#4690)
* [legacy] remove cli of benchmark and update optim
* [doc] fix cli doc test
* [legacy] fix engine clip grad norm
* [legacy] remove outdated colo tensor (#4694)
* [legacy] remove outdated colo tensor
* [test] fix test import
* [legacy] move outdated zero to legacy (#4696)
* [legacy] clean up utils (#4700)
* [legacy] clean up utils
* [example] update examples
* [legacy] clean up amp
* [legacy] fix amp module
* [legacy] clean up gpc (#4742)
* [legacy] clean up context
* [legacy] clean core, constants and global vars
* [legacy] refactor initialize
* [example] fix examples ci
* [example] fix examples ci
* [legacy] fix tests
* [example] fix gpt example
* [example] fix examples ci
* [devops] fix ci installation
* [example] fix examples ci
Files:

* add_your_parallel.md
* define_your_own_parallel_model.md
* integrate_mixture_of_experts_into_your_model.md
* meet_gemini.md
* opt_service.md
* parallelize_your_training_like_Megatron.md
* train_gpt_using_hybrid_parallelism.md
* train_vit_using_pipeline_parallelism.md
* train_vit_with_hybrid_parallelism.md