ColossalAI/tests/test_moe

Latest commit: c54c4fcd15 by botbw, "[hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)", 2 months ago
| File | Last commit | Last updated |
| --- | --- | --- |
| moe_utils.py | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 2 months ago |
| test_deepseek_layer.py | [moe] refactor mesh assignment | 4 months ago |
| test_kernel.py | [moe] full test for deepseek and mixtral (pp + sp to fix) | 4 months ago |
| test_mixtral_layer.py | [moe] refactor mesh assignment | 4 months ago |
| test_moe_checkpoint.py | [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048) | 2 months ago |
| test_moe_ep_tp.py | [chore] solve moe ckpt test failure and some other arg pass failure | 4 months ago |
| test_moe_ep_zero.py | [chore] solve moe ckpt test failure and some other arg pass failure | 4 months ago |