ColossalAI/colossalai/moe
Latest commit d202cc28c0 by Hongxin Liu: [npu] change device to accelerator api (#5239)
* update accelerator

* fix timer

* fix amp

* update

* fix

* update bug

* add error raise

* fix autocast

* fix set device

* remove doc accelerator

* update doc

* update doc

* update doc

* use nullcontext

* update cpu

* update null context

* change time limit for example

* update

* update

* update

* update

* [npu] polish accelerator code

---------

Co-authored-by: Xuanlei Zhao <xuanlei.zhao@gmail.com>
Co-authored-by: zxl <43881818+oahzxl@users.noreply.github.com>
2024-01-09 10:20:05 +08:00
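
For context on the headline change: #5239 replaces hard-coded CUDA device handling with ColossalAI's backend-agnostic accelerator API so the same MoE code can run on NPUs. Below is a minimal sketch of the pattern, assuming `colossalai.accelerator` exposes `get_accelerator()` with a `get_current_device()` method as that PR introduces; treat the exact names as assumptions, not the verbatim diff.

```python
# Hedged sketch of the device -> accelerator change from #5239: ask the
# accelerator layer for the current device instead of assuming CUDA.
import torch
from colossalai.accelerator import get_accelerator  # assumed module path per #5239

accelerator = get_accelerator()             # resolves CUDA, NPU, or CPU backend
device = accelerator.get_current_device()   # replaces a hard-coded torch.device("cuda")
x = torch.randn(4, 4, device=device)        # tensor lands on whichever backend is active
```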
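The "use nullcontext" and "fix autocast" bullets above point at a common pattern for optional mixed precision: return `contextlib.nullcontext()` when autocast is disabled or unsupported, instead of branching around the with-block. A sketch of that pattern follows; the helper name is hypothetical, and only the commit bullets motivate it.

```python
# Hedged sketch of the "use nullcontext" bullet: one context manager whether
# or not autocast is enabled, so call sites need no backend branching.
from contextlib import nullcontext

import torch

def maybe_autocast(enabled: bool, device_type: str = "cuda"):
    # maybe_autocast is a hypothetical helper, not the actual #5239 diff.
    return torch.autocast(device_type=device_type) if enabled else nullcontext()

with maybe_autocast(enabled=torch.cuda.is_available()):
    y = torch.ones(2, 2) @ torch.ones(2, 2)
```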
File             Last commit                                                          Date
__init__.py      [moe] support optimizer checkpoint (#5015)                           2023-11-08 15:07:03 +00:00
_operation.py    [hotfix]: modify create_ep_hierarchical_group and add test (#5032)   2023-11-17 10:53:00 +08:00
checkpoint.py    [moe] support optimizer checkpoint (#5015)                           2023-11-08 15:07:03 +00:00
experts.py       [moe] support optimizer checkpoint (#5015)                           2023-11-08 15:07:03 +00:00
layers.py        [hotfix]: modify create_ep_hierarchical_group and add test (#5032)   2023-11-17 10:53:00 +08:00
load_balance.py  [moe] merge moe into main (#4978)                                    2023-11-02 02:21:24 +00:00
loss.py          [moe] merge moe into main (#4978)                                    2023-11-02 02:21:24 +00:00
manager.py       [moe] support optimizer checkpoint (#5015)                           2023-11-08 15:07:03 +00:00
routers.py       [npu] change device to accelerator api (#5239)                       2024-01-09 10:20:05 +08:00
utils.py         [npu] change device to accelerator api (#5239)                       2024-01-09 10:20:05 +08:00