ColossalAI/colossalai/shardformer/shard
Latest commit: Hongxin Liu, 261eab02fb, [plugin] add 3d parallel plugin (#4295), 2023-08-15 23:25:14 +08:00

* [amp] add mixed precision optimizer
* [plugin] add 3d parallel plugin
* [booster] support pipeline
* [plugin] 3d parallel plugin support clip grad norm
* [shardformer] fix sharder and add plugin test
* [plugin] rename 3d parallel plugin
* [ci] support testmon core pkg change detection (#4305)
* [hotfix] debug testmon
* [hotfix] fix llama
* [hotfix] fix p2p bugs
* [hotfix] fix requirements
File              Last commit                                      Date
__init__.py       [shardformer] Refactor shardformer api (#4001)   2023-07-04 16:05:01 +08:00
shard_config.py   [shardformer] fix type hint                      2023-08-15 23:25:14 +08:00
sharder.py        [plugin] add 3d parallel plugin (#4295)          2023-08-15 23:25:14 +08:00
shardformer.py    [shardformer] rename policy file name            2023-08-15 23:25:14 +08:00
utils.py          [pipeline] update shardformer policy             2023-08-15 23:25:14 +08:00
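
The files above make up the core shard API: shard_config.py holds the ShardConfig dataclass, sharder.py implements the module-replacement sharder, and shardformer.py exposes the ShardFormer entry point that plugins such as the new 3D parallel plugin drive. The sketch below shows how these pieces are typically wired together; the keyword arguments and the return signature of optimize() are assumptions drawn from ColossalAI's public shardformer examples of roughly this period, not from the listing itself.

```python
# Minimal usage sketch of the shard API in this directory: ShardConfig
# (shard_config.py) and ShardFormer (shardformer.py). Keyword names and the
# return value of optimize() are assumptions; verify against the pinned source.
import colossalai
from colossalai.shardformer import ShardConfig, ShardFormer
from transformers import LlamaConfig, LlamaForCausalLM

# Set up the distributed environment (run under torchrun / colossalai run).
colossalai.launch_from_torch(config={})

# A tiny Llama model so the sketch runs without downloading weights.
model = LlamaForCausalLM(
    LlamaConfig(hidden_size=256, intermediate_size=512,
                num_hidden_layers=2, num_attention_heads=4)
)

# Enable tensor parallelism; the TP process group is assumed to default to the
# global group when none is passed. Fused normalization is left off here since
# it needs extra CUDA kernel extensions.
shard_config = ShardConfig(
    enable_tensor_parallelism=True,
    enable_fused_normalization=False,
)

# ShardFormer looks up a model-specific policy and lets the sharder replace
# modules with tensor-parallel counterparts; optimize() is assumed to return
# the sharded model together with any shared parameters.
shard_former = ShardFormer(shard_config=shard_config)
sharded_model, shared_params = shard_former.optimize(model)
```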