Mirror of https://github.com/hpcaitech/ColossalAI
Latest commit b480eec738:

* Support fp8_communication in the Torch DDP grad comm, FSDP grad comm, and FSDP params comm
* Implement a communication hook for the FSDP params all-gather
* Add unit tests for the fp8 operators
* Support fp8 communication in GeminiPlugin
* Update the training scripts to support FSDP and fp8 communication
* Fix minor bugs observed in the unit tests
* Add all_gather_into_tensor_flat_fp8
* Skip the tests if torch < 2.2.0
* Add the fp8_comm flag
* Rebase onto the latest fp8 operators
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
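The commit message describes compressing collective traffic to FP8 via communication hooks. Below is a minimal sketch, not ColossalAI's actual implementation, of what such a hook can look like for DDP gradient buckets: it quantizes the bucket to `torch.float8_e4m3fn` with a per-bucket scale, ships the raw bytes with a plain all-gather (NCCL has no FP8 reduction op), then dequantizes and averages locally. The hook name, the scaling scheme, and the synchronous collective are all assumptions for illustration; the float8 dtypes themselves require a recent torch, consistent with the commit gating its tests on torch >= 2.2.0.

```python
import torch
import torch.distributed as dist


def fp8_compress_hook(state, bucket: dist.GradBucket) -> torch.futures.Future[torch.Tensor]:
    """Hypothetical FP8 gradient-compression hook for DDP (illustration only)."""
    group = state if state is not None else dist.group.WORLD
    world_size = dist.get_world_size(group)
    grad = bucket.buffer()

    # Quantize: rescale into e4m3's representable range (max normal value 448),
    # then cast the whole flattened bucket to FP8.
    scale = (grad.abs().max().clamp(min=1e-12) / 448.0).reshape(1)
    fp8_grad = (grad / scale).to(torch.float8_e4m3fn)

    # NCCL cannot reduce FP8 tensors, so all-gather the raw bytes and the
    # per-rank scales, then accumulate locally in the original precision.
    byte_chunks = [torch.empty_like(fp8_grad.view(torch.uint8)) for _ in range(world_size)]
    scales = [torch.empty_like(scale) for _ in range(world_size)]
    dist.all_gather(scales, scale, group=group)
    dist.all_gather(byte_chunks, fp8_grad.view(torch.uint8), group=group)

    acc = torch.zeros_like(grad)
    for chunk, s in zip(byte_chunks, scales):
        acc += chunk.view(torch.float8_e4m3fn).to(grad.dtype) * s
    grad.copy_(acc / world_size)  # DDP expects the averaged gradient back

    # A production hook would return the pending collective's future instead of
    # blocking; here we return an already-completed future for simplicity.
    fut: torch.futures.Future[torch.Tensor] = torch.futures.Future()
    fut.set_result(grad)
    return fut


# Usage (hypothetical): ddp_model.register_comm_hook(state=None, hook=fp8_compress_hook)
```

The real hooks and the `all_gather_into_tensor_flat_fp8` primitive named in the commit live in the repository itself, with their unit tests presumably under `test_fp8` in the listing below.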
Contents of the tests directory:

* kit
* test_analyzer
* test_auto_parallel
* test_autochunk
* test_booster
* test_checkpoint_io
* test_cluster
* test_config
* test_device
* test_fp8
* test_fx
* test_infer
* test_lazy
* test_legacy
* test_lora
* test_moe
* test_optimizer
* test_pipeline
* test_shardformer
* test_smoothquant
* test_tensor
* test_zero
* __init__.py