ColossalAI/colossalai/booster
Latest commit: 8e718a1421 by botbw (2024-06-26 15:52:09 +08:00)

[gemini] fixes for benchmarking (#5847)

* [gemini] fix missing return
* [gemini] fix missing arg pass
* [gemini] use gather tensor instead of list
* [test] enable flash attention for benchmark by default
* [test] enable flash attention for benchmark by default

Co-authored-by: genghaozhe <939857490@qq.com>
Name            | Last commit message                                          | Last commit date
--------------- | ------------------------------------------------------------ | --------------------------
mixed_precision | [npu] change device to accelerator api (#5239)               | 2024-01-09 10:20:05 +08:00
plugin          | [gemini] fixes for benchmarking (#5847)                      | 2024-06-26 15:52:09 +08:00
__init__.py     | [booster] implemented the torch ddd + resnet example (#3232) | 2023-03-27 10:24:14 +08:00
accelerator.py  | [misc] update pre-commit and run all files (#4752)           | 2023-09-19 14:20:26 +08:00
booster.py      | [Feature] qlora support (#5586)                              | 2024-04-28 10:51:27 +08:00
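For orientation, booster.py defines the Booster class, which ties the plugin and mixed_precision sub-packages into a single training entry point. Below is a minimal usage sketch, not a definitive example from this directory: the toy model, optimizer, and data are placeholders, and it assumes the post-0.3.x Booster API (Booster(...), boost(...), backward(...)) launched under torchrun.

```python
# Minimal sketch of the Booster workflow; the toy model and data are placeholders.
# Run under a distributed launcher, e.g.: torchrun --nproc_per_node=1 train.py
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin  # GeminiPlugin and others live here too

colossalai.launch_from_torch()  # older releases required launch_from_torch(config={})

model = nn.Linear(16, 2).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Booster wraps the model/optimizer/criterion according to the chosen plugin
# and mixed-precision setting (implemented under mixed_precision/).
booster = Booster(plugin=TorchDDPPlugin(), mixed_precision="fp16")
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

x = torch.randn(8, 16, device="cuda")
y = torch.randint(0, 2, (8,), device="cuda")
loss = criterion(model(x), y)
booster.backward(loss, optimizer)  # gradients flow through the booster, not loss.backward()
optimizer.step()
```

Swapping TorchDDPPlugin for a plugin such as GeminiPlugin (the subject of the commits above) is intended to change only the memory-management and parallelism strategy; the training loop itself stays the same.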