ColossalAI/colossalai/booster

Latest commit: 26e553937b by Hongxin Liu, "[fp8] fix linear hook (#6046)", 3 months ago
mixed_precision/   [npu] change device to accelerator api (#5239)                     11 months ago
plugin/            [fp8] fix linear hook (#6046)                                      3 months ago
__init__.py        [booster] implemented the torch ddd + resnet example (#3232)       2 years ago
accelerator.py     [misc] update pre-commit and run all files (#4752)                 1 year ago
booster.py         [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)  3 months ago