ColossalAI/tests/test_fp8
Latest commit by Hanks, b480eec738:
[Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)
* support fp8_communication in Torch DDP gradient communication, FSDP gradient communication, and FSDP parameter communication (sketch below)
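
  For context, a minimal sketch of the idea behind an FP8-compressed DDP gradient hook. This is illustrative, not ColossalAI's actual hook: the function name, the per-bucket scaling, and the all-gather-based exchange are assumptions, and it needs torch >= 2.2 for the float8 dtypes.

import torch
import torch.distributed as dist


def fp8_compress_ddp_grad_hook(state, bucket: dist.GradBucket) -> torch.futures.Future[torch.Tensor]:
    """Quantize the gradient bucket to float8_e4m3fn, exchange the FP8 bytes,
    then dequantize and average locally (sketch only, not ColossalAI's hook)."""
    group = dist.group.WORLD
    world_size = dist.get_world_size(group)
    grad = bucket.buffer()  # flat gradient tensor for this bucket

    # Per-bucket scale so the largest value maps near the e4m3 max (~448).
    scale = grad.abs().max().clamp(min=1e-6).reshape(1) / 448.0
    fp8_payload = (grad / scale).to(torch.float8_e4m3fn).view(torch.uint8)

    # Exchange the FP8 payloads and the per-rank scales.
    payloads = [torch.empty_like(fp8_payload) for _ in range(world_size)]
    scales = [torch.empty_like(scale) for _ in range(world_size)]
    dist.all_gather(payloads, fp8_payload, group=group)
    dist.all_gather(scales, scale, group=group)

    # Dequantize every rank's contribution and average into the bucket buffer.
    grad.zero_()
    for payload, s in zip(payloads, scales):
        grad += payload.view(torch.float8_e4m3fn).to(grad.dtype) * s
    grad /= world_size

    fut: torch.futures.Future[torch.Tensor] = torch.futures.Future()
    fut.set_result(grad)
    return fut


# ddp_model.register_comm_hook(state=None, hook=fp8_compress_ddp_grad_hook)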

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* implement a communication hook for the FSDP params all-gather (sketch below)
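
  The rough shape of an FP8 parameter all-gather, again as a sketch under assumptions rather than the ColossalAI API: the helper name is made up, and it assumes a flat, contiguous local shard and torch >= 2.2.

import torch
import torch.distributed as dist


def all_gather_flat_param_fp8(local_shard: torch.Tensor, group=None) -> torch.Tensor:
    """Gather a flat parameter shard in FP8 and rebuild it in the original dtype (sketch)."""
    world_size = dist.get_world_size(group)
    local_shard = local_shard.contiguous().flatten()

    # Quantize the local shard with a per-shard scale.
    scale = local_shard.abs().max().clamp(min=1e-6).reshape(1) / 448.0
    fp8_bytes = (local_shard / scale).to(torch.float8_e4m3fn).view(torch.uint8)

    # All-gather the raw FP8 bytes and the per-rank scales.
    gathered_bytes = torch.empty(world_size * local_shard.numel(), dtype=torch.uint8, device=local_shard.device)
    dist.all_gather_into_tensor(gathered_bytes, fp8_bytes, group=group)
    scales = torch.empty(world_size, dtype=scale.dtype, device=scale.device)
    dist.all_gather_into_tensor(scales, scale, group=group)

    # Dequantize each shard back to the working precision and stitch them together.
    shards = gathered_bytes.view(torch.float8_e4m3fn).view(world_size, -1)
    return (shards.to(local_shard.dtype) * scales.view(-1, 1)).reshape(-1)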

* add unit tests for fp8 operators

* support fp8 communication in GeminiPlugin

* update training scripts to support FSDP and fp8 communication (usage sketch below)
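
  A hypothetical end-to-end usage sketch via the booster plugins. The fp8_communication=True keyword is an assumption based on the fp8_comm flag added later in this PR; check the released plugin signatures before relying on it.

import torch
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin

# Assumes the distributed environment is already initialized
# (e.g. via colossalai.launch_from_torch under torchrun).
plugin = GeminiPlugin(fp8_communication=True)  # keyword name assumed from this PR
booster = Booster(plugin=plugin)

model = torch.nn.Linear(1024, 1024)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
criterion = torch.nn.MSELoss()

# After boosting, the plugin routes its parameter/gradient collectives
# through the FP8 communication path when the flag is enabled.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)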

* fix some minor bugs observed in unit tests

* add all_gather_into_tensor_flat_fp8

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* skip the test if torch < 2.2.0 (pytest sketch below)
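
  One common way to express such a guard with pytest; a sketch, not necessarily the exact decorator these tests use.

import pytest
import torch
from packaging import version


@pytest.mark.skipif(
    version.parse(torch.__version__) < version.parse("2.2.0"),
    reason="float8 dtypes and FP8 communication require torch >= 2.2.0",
)
def test_fp8_cast_roundtrip():
    # A float8_e4m3fn round trip keeps values within a few percent in this range.
    x = torch.randn(16)
    y = x.to(torch.float8_e4m3fn).to(torch.float32)
    assert torch.allclose(x, y, rtol=0.1, atol=0.1)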

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* skip the test if torch < 2.2.0

* skip the test if torch < 2.2.0

* add fp8_comm flag

* rebase latest fp8 operators

* rebase latest fp8 operators

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Committed 2024-08-08 15:55:01 +08:00
File                             Last commit                                                          Date
test_all_to_all_single.py        [fp8] support all2all fp8 (#5953)                                    2024-08-06 16:58:23 +08:00
test_fp8_all_to_all.py           [Feature] llama shardformer fp8 support (#5938)                      2024-08-05 10:05:47 +08:00
test_fp8_all_to_all_single.py    [Feature] llama shardformer fp8 support (#5938)                      2024-08-05 10:05:47 +08:00
test_fp8_allgather_flat.py       [Feature] llama shardformer fp8 support (#5938)                      2024-08-05 10:05:47 +08:00
test_fp8_allreduce.py            [Feature] llama shardformer fp8 support (#5938)                      2024-08-05 10:05:47 +08:00
test_fp8_cast.py                 [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)    2024-08-08 15:55:01 +08:00
test_fp8_ddp_comm_hook.py        [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)    2024-08-08 15:55:01 +08:00
test_fp8_fsdp_comm_hook.py       [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)    2024-08-08 15:55:01 +08:00
test_fp8_gather.py               [Feature] llama shardformer fp8 support (#5938)                      2024-08-05 10:05:47 +08:00
test_fp8_hook.py                 [fp8] support fp8 amp for hybrid parallel plugin (#5975)             2024-08-07 18:21:08 +08:00
test_fp8_linear.py               [fp8] add fp8 linear (#5967)                                         2024-08-07 15:41:49 +08:00
test_fp8_reduce_scatter.py       [Feature] llama shardformer fp8 support (#5938)                      2024-08-05 10:05:47 +08:00