ColossalAI/colossalai/booster
Latest commit: 5b5fbcff09 by duanjunwen, "[fix] fix hybridparall use_fp8 config" (4 weeks ago)
mixed_precision/   [feat] zerobubble support moehybridplugin (2 months ago)
plugin/            [fix] fix hybridparall use_fp8 config (4 weeks ago)
__init__.py        [booster] implemented the torch ddd + resnet example (#3232) (2 years ago)
accelerator.py     [misc] update pre-commit and run all files (#4752) (1 year ago)
booster.py         [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016) (3 months ago)