* support fp8_communication in Torch DDP gradient comm, FSDP gradient comm, and FSDP params comm
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* implement communication hook for FSDP params all-gather
* add unit tests for fp8 operators
* support fp8 communication in GeminiPlugin
* update training scripts to support FSDP and fp8 communication
* fix minor bugs observed in unit tests
* add all_gather_into_tensor_flat_fp8
* skip the tests if torch < 2.2.0
* add fp8_comm flag
* rebase onto latest fp8 operators
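Several of the commits above (the fp8 comm hooks and `all_gather_into_tensor_flat_fp8`) rest on the same idea: cast a tensor to fp8 with a per-tensor scale before communicating it, then cast back on the receiving side. A minimal pure-Python sketch of scaled e4m3 quantization, assuming the OCP fp8 e4m3fn format (4 exponent bits, 3 mantissa bits, max finite value 448); the helper names (`to_e4m3`, `fp8_compress`) are illustrative and not this PR's actual API:

```python
import math

E4M3_MAX = 448.0  # largest finite value in OCP fp8 e4m3fn

def to_e4m3(v: float) -> float:
    """Round v to the nearest representable fp8 e4m3 value (pure-Python model)."""
    if v == 0.0:
        return 0.0
    sign = -1.0 if v < 0 else 1.0
    a = min(abs(v), E4M3_MAX)      # clamp to the finite range
    if a < 2.0 ** -6:              # subnormal range: multiples of 2^-9
        step = 2.0 ** -9
    else:                          # normal range: 3 mantissa bits => step 2^(e-3)
        step = 2.0 ** (math.floor(math.log2(a)) - 3)
    return sign * round(a / step) * step

def fp8_compress(values):
    """Quantize with a per-tensor scale so the largest magnitude maps to E4M3_MAX."""
    amax = max(abs(v) for v in values)
    scale = amax / E4M3_MAX if amax > 0 else 1.0
    return [to_e4m3(v / scale) for v in values], scale

def fp8_decompress(quantized, scale):
    """Undo the per-tensor scaling on the receiving side."""
    return [q * scale for q in quantized]
```

On the wire a sender would transmit the quantized payload (one byte per element) plus the scale; with 3 mantissa bits the round trip loses at most about 1/16 of each value's magnitude, which is the accuracy/bandwidth trade-off these comm hooks make.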
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>