ColossalAI/colossalai/booster
Hongxin Liu 261eab02fb [plugin] add 3d parallel plugin (#4295)
* [amp] add mixed precision optimizer
* [plugin] add 3d parallel plugin
* [booster] support pipeline
* [plugin] 3d parallel plugin support clip grad norm
* [shardformer] fix sharder and add plugin test
* [plugin] rename 3d parallel plugin
* [ci] support testmon core pkg change detection (#4305)
* [hotfix] debug testmon
* [hotfix] fix llama
* [hotfix] fix p2p bugs
* [hotfix] fix requirements
2023-08-15 23:25:14 +08:00
mixed_precision [NFC] Fix format for mixed precision (#4253) 2023-07-26 14:12:57 +08:00
plugin [plugin] add 3d parallel plugin (#4295) 2023-08-15 23:25:14 +08:00
__init__.py [booster] implemented the torch ddp + resnet example (#3232) 2023-03-27 10:24:14 +08:00
accelerator.py [booster] added the accelerator implementation (#3159) 2023-03-20 13:59:24 +08:00
booster.py [plugin] add 3d parallel plugin (#4295) 2023-08-15 23:25:14 +08:00