ColossalAI/colossalai/booster

Latest commit: Hongxin Liu e86127925a — [plugin] support all-gather overlap for hybrid parallel (#5919), 4 months ago
Name              Last commit                                                        Age
mixed_precision   [npu] change device to accelerator api (#5239)                     11 months ago
plugin            [plugin] support all-gather overlap for hybrid parallel (#5919)    4 months ago
__init__.py       [booster] implemented the torch ddd + resnet example (#3232)       2 years ago
accelerator.py    [misc] update pre-commit and run all files (#4752)                 1 year ago
booster.py        [Feature] qlora support (#5586)                                    7 months ago