Mirror of https://github.com/hpcaitech/ColossalAI
Latest commit `d202cc28c0`:

* update accelerator
* fix timer
* fix amp
* update
* fix
* update bug
* add error raise
* fix autocast
* fix set device
* remove doc accelerator
* update doc
* update doc
* update doc
* use nullcontext
* update cpu
* update null context
* change time limit for example
* update
* update
* update
* update
* [npu] polish accelerator code

Co-authored-by: Xuanlei Zhao <xuanlei.zhao@gmail.com>
Co-authored-by: zxl <43881818+oahzxl@users.noreply.github.com>
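The "use nullcontext" and "fix autocast" items above refer to a common AMP pattern: when mixed precision is disabled, substitute `contextlib.nullcontext` for the autocast context so call sites need no branching. A minimal sketch of that pattern (the `maybe_autocast` helper is illustrative, not ColossalAI's actual API):

```python
from contextlib import nullcontext

def maybe_autocast(enabled: bool, device_type: str = "cuda"):
    # Hypothetical helper: return a real autocast context only when AMP
    # is enabled; otherwise a no-op context manager.
    if enabled:
        import torch  # only needed on the AMP path
        return torch.autocast(device_type=device_type)
    return nullcontext()

# Call sites stay uniform regardless of the AMP setting:
with maybe_autocast(enabled=False):
    result = sum(x * x for x in range(4))  # runs at full precision
```

With this shape, toggling AMP is a single flag rather than an `if` around every forward pass.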
* integrate_mixture_of_experts_into_your_model.md
* meet_gemini.md
* opt_service.md
* train_gpt_using_hybrid_parallelism.md
* train_vit_with_hybrid_parallelism.md