Mirror of https://github.com/hpcaitech/ColossalAI

Commit: c54c4fcd15
* [example] pass use_fp8_comm flag to all plugins
* [example] add mixtral benchmark
* [moe] refine assertion and check
* [moe] fix mixtral & add more tests
* [moe] consider checking dp * sp group and moe_dp_group
* [mixtral] remove gate tp & add more tests
* [deepseek] fix tp & sp for deepseek
* [mixtral] minor fix
* [deepseek] add deepseek benchmark
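The first item above, threading the `use_fp8_comm` flag through to the plugins, typically amounts to a small piece of argument plumbing in each example script. Below is a minimal sketch of that pattern; the `fp8_communication` keyword, the `MoeHybridParallelPlugin` arguments, and the launch call are assumptions based on ColossalAI's public API and may differ from the actual scripts in this commit.

```python
# Minimal sketch: forward a --use_fp8_comm CLI flag into a ColossalAI plugin.
# The `fp8_communication` keyword and the exact MoeHybridParallelPlugin
# arguments are assumptions; check the plugin signatures in your version.
import argparse

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import MoeHybridParallelPlugin

parser = argparse.ArgumentParser()
parser.add_argument("--use_fp8_comm", action="store_true",
                    help="use FP8 for inter-GPU communication")
args = parser.parse_args()

colossalai.launch_from_torch()  # requires a torchrun-style distributed launch

plugin = MoeHybridParallelPlugin(
    tp_size=1,                            # tensor parallel degree
    pp_size=1,                            # pipeline parallel degree
    ep_size=2,                            # expert parallel degree for MoE layers
    fp8_communication=args.use_fp8_comm,  # assumed keyword, see note above
)
booster = Booster(plugin=plugin)
```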
Directory contents:

moe_utils.py
test_deepseek_layer.py
test_kernel.py
test_mixtral_layer.py
test_moe_checkpoint.py
test_moe_ep_tp.py
test_moe_ep_zero.py