mirror of https://github.com/hpcaitech/ColossalAI
Contents:

- bert/
- commons/
- deepseek/
- gpt/
- grok-1/
- llama/
- mixtral/
- opt/
- palm/
- __init__.py
- data_utils.py
- model_utils.py
- performance_evaluator.py