mirror of https://github.com/hpcaitech/ColossalAI
Latest commit: [gemini] fix missing return; [gemini] fix missing arg pass; [gemini] use gather tensor instead of list; [test] enable flash attention for benchmark by default (co-authored by genghaozhe <939857490@qq.com>)
- __init__.py
- dp_plugin_base.py
- gemini_plugin.py
- hybrid_parallel_plugin.py
- low_level_zero_plugin.py
- moe_hybrid_parallel_plugin.py
- plugin_base.py
- pp_plugin_base.py
- torch_ddp_plugin.py
- torch_fsdp_plugin.py
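The file names above suggest a common pattern: a shared base class (`plugin_base.py`) that each parallelism strategy (DDP, FSDP, Gemini, hybrid parallel, low-level ZeRO) implements, with a driver object delegating model/optimizer setup to the chosen plugin. The following is a minimal, hypothetical sketch of that pattern in plain Python; the class and method names (`PluginBase`, `NoOpPlugin`, `Booster.boost`) are illustrative stand-ins, not ColossalAI's actual API.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of the plugin pattern suggested by plugin_base.py:
# each plugin wraps a model and optimizer for one parallelism strategy.
class PluginBase(ABC):
    @abstractmethod
    def configure(self, model, optimizer):
        """Return the (possibly wrapped) model and optimizer."""

class NoOpPlugin(PluginBase):
    # Stand-in for a real strategy such as DDP, FSDP, or Gemini:
    # a real plugin would wrap the model for distributed execution here.
    def configure(self, model, optimizer):
        return model, optimizer

class Booster:
    # Illustrative driver object: it owns no strategy logic itself and
    # simply delegates setup to whichever plugin it was constructed with.
    def __init__(self, plugin: PluginBase):
        self.plugin = plugin

    def boost(self, model, optimizer):
        return self.plugin.configure(model, optimizer)

booster = Booster(NoOpPlugin())
model, opt = booster.boost({"weights": [0.1]}, {"lr": 0.01})
```

Swapping strategies then only requires constructing the driver with a different plugin, which is the design the directory layout above implies.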