ColossalAI/tests/test_moe
Latest commit c54c4fcd15 by botbw (2024-09-10 17:30:53 +08:00):
[hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)

* [example] pass use_fp8_comm flag to all plugins
* [example] add mixtral benchmark
* [moe] refine assertion and check
* [moe] fix mixtral & add more tests
* [moe] consider checking dp * sp group and moe_dp_group
* [mixtral] remove gate tp & add more tests
* [deepseek] fix tp & sp for deepseek
* [mixtral] minor fix
* [deepseek] add deepseek benchmark
File                      Last commit                                                           Date
moe_utils.py              [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)    2024-09-10 17:30:53 +08:00
test_deepseek_layer.py    [moe] refactor mesh assignment                                        2024-08-01 10:06:59 +08:00
test_kernel.py            [moe] full test for deepseek and mixtral (pp + sp to fix)             2024-08-01 10:06:59 +08:00
test_mixtral_layer.py     [moe] refactor mesh assignment                                        2024-08-01 10:06:59 +08:00
test_moe_checkpoint.py    [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)    2024-09-10 17:30:53 +08:00
test_moe_ep_tp.py         [chore] solve moe ckpt test failure and some other arg pass failure  2024-08-01 10:06:59 +08:00
test_moe_ep_zero.py       [chore] solve moe ckpt test failure and some other arg pass failure  2024-08-01 10:06:59 +08:00
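
These tests follow the spawn-based pattern used across the ColossalAI test suite: a per-rank worker initializes the distributed group and runs the actual check, while pytest collects a thin wrapper that spawns the workers. The sketch below only illustrates that pattern; the `spawn`/`rerun_if_address_is_in_use` helpers and the `colossalai.launch` keyword arguments are assumptions drawn from how other tests in the repository are written, not a copy of any file listed above.

```python
# Minimal sketch of the spawn-based test pattern assumed for this directory.
# The helper names (spawn, rerun_if_address_is_in_use) and the launch() kwargs
# are assumptions based on the wider ColossalAI test suite, not on these files.
import pytest
import torch.distributed as dist

import colossalai
from colossalai.testing import rerun_if_address_is_in_use, spawn


def run_dist(rank: int, world_size: int, port: int):
    # Each spawned process joins the same group before exercising the MoE layer.
    colossalai.launch(rank=rank, world_size=world_size, host="localhost", port=port)
    assert dist.get_world_size() == world_size
    # ... build a small MoE layer here and compare EP/TP/ZeRO outputs ...


@pytest.mark.dist
@rerun_if_address_is_in_use()
def test_moe_parallel_smoke():
    # Launch two ranks; the real tests in this directory use larger meshes
    # to cover EP x TP x SP combinations.
    spawn(run_dist, nprocs=2)
```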