ColossalAI/colossalai/booster/plugin
Latest commit: 0a51319113 by flybird11111 (2024-08-16 10:13:07 +08:00)
[fp8] zero support fp8 linear. (#6006)
File                            Latest commit                                                        Date
__init__.py                     [shardformer] fix the moe (#5883)                                    2024-07-03 20:02:19 +08:00
dp_plugin_base.py               [llama] fix dataloader for hybrid parallel (#5358)                   2024-02-05 15:14:56 +08:00
gemini_plugin.py                [fp8] support gemini plugin (#5978)                                  2024-08-09 14:09:48 +08:00
hybrid_parallel_plugin.py       [fp8] support gemini plugin (#5978)                                  2024-08-09 14:09:48 +08:00
low_level_zero_plugin.py        [fp8] zero support fp8 linear. (#6006)                               2024-08-16 10:13:07 +08:00
moe_hybrid_parallel_plugin.py   [fp8] add use_fp8 option for MoeHybridParallelPlugin (#6009)         2024-08-16 10:12:50 +08:00
plugin_base.py                  [lora] add lora APIs for booster, support lora for TorchDDP (#4981)  2024-04-28 10:51:27 +08:00
pp_plugin_base.py               [misc] update pre-commit and run all files (#4752)                   2023-09-19 14:20:26 +08:00
torch_ddp_plugin.py             [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)   2024-08-08 15:55:01 +08:00
torch_fsdp_plugin.py            [Feature]: support FP8 communication in DDP, FSDP, Gemini (#5928)   2024-08-08 15:55:01 +08:00
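The recent commits in this directory center on FP8 support across the booster plugins. As a rough illustration of how such an option surfaces at the Booster level, below is a minimal sketch using LowLevelZeroPlugin. The `use_fp8` keyword here is an assumption inferred from the title of commit #6006 ("zero support fp8 linear"); check low_level_zero_plugin.py for the actual parameter name and placement.

```python
# Minimal sketch: enabling FP8 with the low-level ZeRO plugin.
# Assumption: `use_fp8` mirrors the option added in #6006 and may
# differ in the actual code.
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import LowLevelZeroPlugin

colossalai.launch_from_torch()  # initialize the distributed environment

model = nn.Linear(1024, 1024).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

plugin = LowLevelZeroPlugin(
    stage=2,           # ZeRO stage 2: shard gradients and optimizer states
    precision="bf16",  # mixed-precision setting for the wrapped optimizer
    use_fp8=True,      # hypothetical flag: FP8 linear layers per #6006
)
booster = Booster(plugin=plugin)
model, optimizer, *_ = booster.boost(model, optimizer)
```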
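Commit #5928 adds FP8 communication to the DDP, FSDP, and Gemini plugins. A sketch of how that might be toggled on TorchDDPPlugin follows; `fp8_communication` is likewise inferred from the commit title and could be named differently, so verify against torch_ddp_plugin.py.

```python
# Sketch: FP8 gradient communication with the DDP plugin.
# Assumption: `fp8_communication` is the flag introduced by #5928.
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

plugin = TorchDDPPlugin(fp8_communication=True)  # hypothetical flag name
booster = Booster(plugin=plugin)
```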