ColossalAI/tests/test_fp8
Latest commit: f20b066c59 by Guangyao Zhang, 2024-09-14 10:40:01 +08:00
[fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059)

* use FP8 all_gather only internode; fix pytest
* fix pytest compile error on CUDA arch < 89
* fix pytest failure
* disable all_gather_into_tensor_flat_fp8
* fix fp8 format
* fix pytest
* resolve review conversations
* convert chunk tuple to list
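The idea behind this commit is that FP8-compressed all_gather only pays off across nodes, where network bandwidth is the bottleneck; inside a node the quantize/dequantize overhead outweighs the savings, so the plain all_gather is used. Below is a minimal PyTorch sketch of that internode-only gating, not ColossalAI's actual implementation: the helper names, the LOCAL_WORLD_SIZE-based node check, and the per-tensor e4m3 scaling scheme are all illustrative assumptions.

```python
# Sketch of internode-only FP8 all_gather (illustrative, not ColossalAI's API).
import os
import torch
import torch.distributed as dist


def _is_intranode(group) -> bool:
    # Assumption: all nodes expose the same GPU count, so a group no larger
    # than the local world size is treated as living on a single node.
    local_world_size = int(os.environ.get("LOCAL_WORLD_SIZE", torch.cuda.device_count()))
    return dist.get_world_size(group) <= local_world_size


def all_gather_maybe_fp8(out: torch.Tensor, inp: torch.Tensor, group=None) -> None:
    """All-gather `inp` into `out`, compressing to FP8 only across nodes."""
    if _is_intranode(group):
        # Intranode (NVLink/PCIe): skip FP8, the cast costs more than it saves.
        dist.all_gather_into_tensor(out, inp, group=group)
        return

    # Internode: scale into the e4m3 range, communicate 1-byte payloads,
    # then dequantize each rank's chunk with that rank's scale.
    fp8_max = torch.finfo(torch.float8_e4m3fn).max
    scale = inp.abs().max().clamp(min=1e-12) / fp8_max
    inp_fp8 = (inp / scale).to(torch.float8_e4m3fn)

    out_fp8 = torch.empty(out.shape, dtype=torch.float8_e4m3fn, device=out.device)
    # NCCL has no float8 type, so the bytes are shipped as uint8 views.
    dist.all_gather_into_tensor(out_fp8.view(torch.uint8), inp_fp8.view(torch.uint8), group=group)

    world_size = dist.get_world_size(group)
    scales = torch.empty(world_size, dtype=scale.dtype, device=scale.device)
    dist.all_gather_into_tensor(scales, scale.reshape(1), group=group)

    for rank, chunk in enumerate(out_fp8.chunk(world_size, dim=0)):
        out.narrow(0, rank * chunk.shape[0], chunk.shape[0]).copy_(chunk.to(out.dtype) * scales[rank])
```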
File | Last commit | Date
test_all_to_all_single.py | [fp8] support asynchronous FP8 communication (#5997) | 2024-08-14 14:08:19 +08:00
test_fp8_all_to_all.py | [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) | 2024-09-14 10:40:01 +08:00
test_fp8_all_to_all_single.py | [Feature] llama shardformer fp8 support (#5938) | 2024-08-05 10:05:47 +08:00
test_fp8_allgather.py | [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059) | 2024-09-14 10:40:01 +08:00
test_fp8_allreduce.py | [fp8] support asynchronous FP8 communication (#5997) | 2024-08-14 14:08:19 +08:00
test_fp8_cast.py | [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928) | 2024-08-08 15:55:01 +08:00
test_fp8_ddp_comm_hook.py | [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928) | 2024-08-08 15:55:01 +08:00
test_fp8_fsdp_comm_hook.py | [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928) | 2024-08-08 15:55:01 +08:00
test_fp8_hook.py | [fp8] support gemini plugin (#5978) | 2024-08-09 14:09:48 +08:00
test_fp8_linear.py | [fp8] add fp8 linear (#5967) | 2024-08-07 15:41:49 +08:00
test_fp8_reduce_scatter.py | [fp8] update reduce-scatter test (#6002) | 2024-08-15 14:40:54 +08:00
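These tests exercise FP8 casts and FP8-compressed collectives (all_gather, all_reduce, all_to_all, reduce_scatter) plus the DDP/FSDP communication hooks built on them. As an illustration of the kind of round-trip check involved, here is a minimal cast sketch in the spirit of test_fp8_cast.py; the locally defined helpers, scaling scheme, and tolerances are assumptions, not the repository's test code or API.

```python
# Illustrative FP8 cast round-trip check (not the actual test_fp8_cast.py).
import torch
from torch.testing import assert_close


def cast_to_fp8(x: torch.Tensor, fp8_dtype=torch.float8_e4m3fn):
    """Per-tensor scaled cast to FP8; returns the FP8 tensor and its scale."""
    scale = x.abs().max().clamp(min=1e-12) / torch.finfo(fp8_dtype).max
    return (x / scale).to(fp8_dtype), scale


def cast_from_fp8(x_fp8: torch.Tensor, scale: torch.Tensor, dtype=torch.float32):
    """Dequantize back to a higher-precision dtype."""
    return x_fp8.to(dtype) * scale


if __name__ == "__main__":
    x = torch.randn(64, 64)
    x_fp8, scale = cast_to_fp8(x)
    x_rec = cast_from_fp8(x_fp8, scale, dtype=x.dtype)
    # e4m3 has only 3 mantissa bits, so a loose tolerance is the most one can assert.
    assert_close(x_rec, x, rtol=0.125, atol=0.125)
```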