ColossalAI/tests/test_moe
Latest commit: 7d8e0338a4 (Xuanlei Zhao) [moe] init mixtral impl, 2024-02-07 19:21:02 +08:00
File                       Last commit                                                Date
moe_utils.py               [moe] init mixtral impl                                    2024-02-07 19:21:02 +08:00
test_grad_handler.py       [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_kernel.py             [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_moe_checkpoint.py     [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_moe_ep_tp.py          [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_moe_group.py          [npu] change device to accelerator api (#5239)             2024-01-09 10:20:05 +08:00
test_moe_hybrid_zero.py    [moe] support optimizer checkpoint (#5015)                 2023-11-08 15:07:03 +00:00
test_moe_load_balance.py   [moe] support optimizer checkpoint (#5015)                 2023-11-08 15:07:03 +00:00
test_moe_router.py         [moe]: fix ep/tp tests, add hierarchical all2all (#4982)   2023-11-09 06:31:00 +00:00
test_moe_zero_fwd_bwd.py   [moe] init mixtral impl                                    2024-02-07 19:21:02 +08:00
test_moe_zero_optim.py     [moe] init mixtral impl                                    2024-02-07 19:21:02 +08:00