ColossalAI/colossalai/booster
Latest commit: c54c4fcd15 by botbw, 2024-09-10 17:30:53 +08:00
[hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)

* [example] pass use_fp8_comm flag to all plugins
* [example] add mixtral benchmark
* [moe] refine assertion and check
* [moe] fix mixtral & add more tests
* [moe] consider checking dp * sp group and moe_dp_group
* [mixtral] remove gate tp & add more tests
* [deepseek] fix tp & sp for deepseek
* [mixtral] minor fix
* [deepseek] add deepseek benchmark
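
The commit above threads a use_fp8_comm flag from the example scripts into the plugins and fixes the MoE hybrid parallel path. A minimal, hypothetical sketch of how a benchmark script might pass such a flag into the MoE plugin is shown below; the constructor arguments (tp_size, pp_size, ep_size, zero_stage, fp8_communication) are assumptions inferred from the commit messages, not a verified reproduction of the plugin's signature.

```python
# Hypothetical sketch: wiring an fp8-communication flag from a benchmark
# script into the MoE hybrid parallel plugin. Argument names are assumed
# from the commit messages above; check the plugin source for the real API.
import argparse

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import MoeHybridParallelPlugin

parser = argparse.ArgumentParser()
parser.add_argument("--use_fp8_comm", action="store_true")  # flag named in the commit
parser.add_argument("--ep_size", type=int, default=2)       # expert-parallel group size
args = parser.parse_args()

colossalai.launch_from_torch()  # initialize the distributed environment

# Assumed keyword arguments for illustration only.
plugin = MoeHybridParallelPlugin(
    tp_size=1,
    pp_size=1,
    ep_size=args.ep_size,
    zero_stage=1,
    fp8_communication=args.use_fp8_comm,
)
booster = Booster(plugin=plugin)
```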
Name               Last commit message                                                  Last commit date
mixed_precision    [npu] change device to accelerator api (#5239)                       2024-01-09 10:20:05 +08:00
plugin             [hotfix] moe hybrid parallelism benchmark & follow-up fix (#6048)    2024-09-10 17:30:53 +08:00
__init__.py        [booster] implemented the torch ddp + resnet example (#3232)         2023-03-27 10:24:14 +08:00
accelerator.py     [misc] update pre-commit and run all files (#4752)                   2023-09-19 14:20:26 +08:00
booster.py         [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)    2024-08-22 09:21:34 +08:00
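
For context, booster.py in this directory provides the Booster entry point that wraps a model, optimizer, criterion, and (optionally) dataloader and scheduler according to the chosen plugin. A minimal usage sketch following the documented Booster pattern; the toy model, optimizer, and device handling are placeholders, not code from this directory:

```python
# Minimal sketch of the Booster workflow with the Torch DDP plugin.
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

colossalai.launch_from_torch()  # set up the distributed environment (run via torchrun)

model = nn.Linear(16, 2)                                      # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)      # placeholder optimizer
criterion = nn.CrossEntropyLoss()

# Booster injects the plugin's parallelism/precision behavior into these objects.
plugin = TorchDDPPlugin()
booster = Booster(plugin=plugin)
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

x = torch.randn(4, 16).cuda()
y = torch.randint(0, 2, (4,)).cuda()
loss = criterion(model(x), y)
booster.backward(loss, optimizer)  # plugin-aware backward pass
optimizer.step()
```

Swapping TorchDDPPlugin for one of the plugins under plugin/ (e.g. the hybrid or MoE hybrid parallel plugins touched by the commits above) changes the parallelism strategy without changing this training-loop structure.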