ColossalAI/colossalai/booster
Latest commit: 8795bb2e80 by Edenzzzz, "Support 4d parallel + flash attention (#5789)", 5 months ago
Name             | Last commit                                                  | Last updated
mixed_precision  | [npu] change device to accelerator api (#5239)               | 11 months ago
plugin           | Support 4d parallel + flash attention (#5789)                | 5 months ago
__init__.py      | [booster] implemented the torch ddp + resnet example (#3232) | 2 years ago
accelerator.py   | [misc] update pre-commit and run all files (#4752)           | 1 year ago
booster.py       | [Feature] qlora support (#5586)                              | 7 months ago