ColossalAI/colossalai/booster
Latest commit: d799a3088f by Wenhao Chen, "[pipeline]: add p2p fallback order and fix interleaved pp deadlock (#5214)", 11 months ago
mixed_precision    [npu] add npu support for hybrid plugin and llama (#5090)                     1 year ago
plugin             [pipeline]: add p2p fallback order and fix interleaved pp deadlock (#5214)    11 months ago
__init__.py        [booster] implemented the torch ddd + resnet example (#3232)                  2 years ago
accelerator.py     [misc] update pre-commit and run all files (#4752)                            1 year ago
booster.py         [lazy] support from_pretrained (#4801)                                        1 year ago