mirror of https://github.com/hpcaitech/ColossalAI
Latest commit: [fp8] refactor hook * [fp8] support gemini plugin * [example] add fp8 option for llama benchmark
Test files:

* test_all_to_all_single.py
* test_fp8_all_to_all.py
* test_fp8_all_to_all_single.py
* test_fp8_allgather_flat.py
* test_fp8_allreduce.py
* test_fp8_cast.py
* test_fp8_ddp_comm_hook.py
* test_fp8_fsdp_comm_hook.py
* test_fp8_gather.py
* test_fp8_hook.py
* test_fp8_linear.py
* test_fp8_reduce_scatter.py
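
These tests cover FP8 casting, FP8-compressed collectives (all-to-all, all-gather, all-reduce, gather, reduce-scatter), and FP8 DDP/FSDP communication hooks. As a rough illustration of the basic operation behind them (the kind of round-trip that a test like test_fp8_cast.py exercises), the sketch below casts a tensor to FP8 with per-tensor scaling and back using plain PyTorch's `torch.float8_e4m3fn` dtype (PyTorch >= 2.1). The helper names and scaling scheme here are assumptions for illustration, not ColossalAI's actual fp8 API.

```python
# Minimal sketch of an FP8 cast round-trip with per-tensor scaling.
# Illustration only; not ColossalAI's fp8 implementation.
import torch

FP8_E4M3_MAX = 448.0  # largest finite value representable in e4m3


def cast_to_fp8(x: torch.Tensor):
    """Scale a tensor into the e4m3 range and cast it to FP8."""
    amax = x.abs().max().clamp(min=1e-12)
    scale = FP8_E4M3_MAX / amax  # map the largest value near the FP8 limit
    x_fp8 = (x * scale).to(torch.float8_e4m3fn)
    return x_fp8, scale


def cast_from_fp8(x_fp8: torch.Tensor, scale: torch.Tensor, dtype=torch.float32):
    """Undo the scaling and return a higher-precision tensor."""
    return x_fp8.to(dtype) / scale


if __name__ == "__main__":
    x = torch.randn(1024)
    x_fp8, scale = cast_to_fp8(x)
    x_rec = cast_from_fp8(x_fp8, scale)
    # FP8 is lossy; the round-trip should be close but not exact.
    print("max abs error:", (x - x_rec).abs().max().item())
```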