ColossalAI/docs/source/en/advanced_tutorials

Latest commit d202cc28c0 by Hongxin Liu, 2024-01-09 10:20:05 +08:00

[npu] change device to accelerator api (#5239)

* update accelerator
* fix timer
* fix amp
* update
* fix
* update bug
* add error raise
* fix autocast
* fix set device
* remove doc accelerator
* update doc
* update doc
* update doc
* use nullcontext
* update cpu
* update null context
* change time limit for example
* update
* update
* update
* update
* [npu] polish accelerator code

---------

Co-authored-by: Xuanlei Zhao <xuanlei.zhao@gmail.com>
Co-authored-by: zxl <43881818+oahzxl@users.noreply.github.com>
| File | Last commit | Date |
| --- | --- | --- |
| integrate_mixture_of_experts_into_your_model.md | [doc] update moe chinese document. (#3890) | 2023-06-05 15:57:54 +08:00 |
| meet_gemini.md | fix typo docs/ (#4033) | 2023-06-28 15:30:30 +08:00 |
| opt_service.md | fix typo docs/ | 2023-05-24 13:57:43 +08:00 |
| train_gpt_using_hybrid_parallelism.md | [npu] change device to accelerator api (#5239) | 2024-01-09 10:20:05 +08:00 |
| train_vit_with_hybrid_parallelism.md | [doc] update advanced tutorials, training gpt with hybrid parallelism (#4866) | 2023-10-10 08:18:55 +00:00 |