ColossalAI/colossalai/booster
Latest commit: 8795bb2e80 by Edenzzzz (2024-06-17 17:40:47 +08:00)

    Support 4d parallel + flash attention (#5789)

    * support tp + sp + pp
    * remove comments

    Co-authored-by: Edenzzzz <wtan45@wisc.edu>
mixed_precision   [npu] change device to accelerator api (#5239)               2024-01-09 10:20:05 +08:00
plugin            Support 4d parallel + flash attention (#5789)                2024-06-17 17:40:47 +08:00
__init__.py       [booster] implemented the torch ddd + resnet example (#3232)  2023-03-27 10:24:14 +08:00
accelerator.py    [misc] update pre-commit and run all files (#4752)           2023-09-19 14:20:26 +08:00
booster.py        [Feature] qlora support (#5586)                              2024-04-28 10:51:27 +08:00