ColossalAI/tests/test_moe
Latest commit: [npu] change device to accelerator api (#5239)
Author: Hongxin Liu (d202cc28c0)
Date: 2024-01-09 10:20:05 +08:00
Co-authored-by: Xuanlei Zhao <xuanlei.zhao@gmail.com>
Co-authored-by: zxl <43881818+oahzxl@users.noreply.github.com>

Commit message:
* update accelerator
* fix timer
* fix amp
* update
* fix
* update bug
* add error raise
* fix autocast
* fix set device
* remove doc accelerator
* update doc
* update doc
* update doc
* use nullcontext
* update cpu
* update null context
* change time limit for example
* udpate
* update
* update
* update
* [npu] polish accelerator code
File                       Last commit                                                Date
moe_utils.py               [moe]: fix ep/tp tests, add hierarchical all2all (#4982)   2023-11-09 06:31:00 +00:00
test_grad_handler.py       [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_kernel.py             [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_moe_checkpoint.py     [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_moe_ep_tp.py          [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_moe_group.py          [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_moe_hybrid_zero.py    [moe] support optimizer checkpoint (#5015)                 2023-11-08 15:07:03 +00:00
test_moe_load_balance.py   [moe] support optimizer checkpoint (#5015)                 2023-11-08 15:07:03 +00:00
test_moe_router.py         [moe]: fix ep/tp tests, add hierarchical all2all (#4982)   2023-11-09 06:31:00 +00:00
test_moe_zero_fwd_bwd.py   [moe] support optimizer checkpoint (#5015)                 2023-11-08 15:07:03 +00:00
test_moe_zero_optim.py     [moe] support optimizer checkpoint (#5015)                 2023-11-08 15:07:03 +00:00