ColossalAI/colossalai/pipeline
Latest commit: 261eab02fb [plugin] add 3d parallel plugin (#4295) by Hongxin Liu, 2023-08-15 23:25:14 +08:00
* [amp] add mixed precision optimizer
* [plugin] add 3d parallel plugin
* [booster] support pipeline
* [plugin] 3d parallel plugin support clip grad norm
* [shardformer] fix sharder and add plugin test
* [plugin] rename 3d parallel plugin
* [ci] support testmon core pkg change detection (#4305)
* [hotfix] debug testmon
* [hotfix] fix llama
* [hotfix] fix p2p bugs
* [hotfix] fix requirements
Contents (name: last commit, date):
middleware: [Pipeline Middleware] Adapt scheduler for Topo (#2066) (2022-12-05 20:23:41 +08:00)
policy: [pipeline] move bert related pipeline components to shardformer (#4187) (2023-08-15 23:25:14 +08:00)
rpc: [nfc]fix typo colossalai/pipeline tensor nn (#3899) (2023-06-06 14:07:36 +08:00)
schedule: [pipeline] All bert models (#4233) (2023-08-15 23:25:14 +08:00)
__init__.py: fix file name (#1759) (2022-10-25 16:48:48 +08:00)
layer_spec.py: fix file name (#1759) (2022-10-25 16:48:48 +08:00)
p2p.py: [plugin] add 3d parallel plugin (#4295) (2023-08-15 23:25:14 +08:00)
pipelinable.py: [nfc]fix typo colossalai/pipeline tensor nn (#3899) (2023-06-06 14:07:36 +08:00)
pipeline_process_group.py: [pipeline/fix-bug] num_microbatches support any integrate | stable chimera | launch tool for rpc pp framework (#1684) (2022-10-10 16:01:02 +08:00)
stage_manager.py: [pipeline] add stage manager (#4093) (2023-08-15 23:25:14 +08:00)
utils.py: [nfc]fix typo colossalai/pipeline tensor nn (#3899) (2023-06-06 14:07:36 +08:00)