ColossalAI/colossalai/booster
Baizhou Zhang 0ceec8f9a9 [pipeline] support fp32 for HybridPlugin / merge shardformer test and pipeline test into one file (#4354)
* add a naive optimizer for 3DPlugin / refactor the gpt2 shardformer test

* merge tests of PP/DP/TP combinations into one test file

* fix a bug when syncing gradients for DP in HybridPlugin

* update supported precisions for 3DPlugin / fix a bug when shifting tp_degree

* improve how lazy_init is passed

* modify lazy_init / use sync_shared_params
2023-08-15 23:25:14 +08:00
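
A minimal sketch (not taken from the repo's tests) of how the fp32 path added in #4354 can be exercised through the Booster API, assuming the commit's "HybridPlugin" refers to the HybridParallelPlugin class; the tp_size/pp_size values, the toy model, and the config={} launch argument are illustrative assumptions for the API as of this commit.

    import torch
    import torch.nn as nn

    import colossalai
    from colossalai.booster import Booster
    from colossalai.booster.plugin import HybridParallelPlugin

    # Launch a distributed session (run this script under torchrun).
    colossalai.launch_from_torch(config={})

    # precision="fp32" is the mode this commit enables; in fp32 the plugin
    # wraps the optimizer in a naive (non-mixed-precision) wrapper rather
    # than an AMP one. tp_size/pp_size here are illustrative.
    plugin = HybridParallelPlugin(tp_size=2, pp_size=1, precision="fp32")
    booster = Booster(plugin=plugin)

    # Toy model and optimizer; a real model is boosted the same way.
    model = nn.Linear(32, 32)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.MSELoss()

    # boost() returns wrapped objects; gradient synchronization across DP
    # ranks (the bug fixed above) happens inside the wrapped objects.
    model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)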
mixed_precision/   [NFC] Fix format for mixed precision (#4253)                                                                2023-07-26 14:12:57 +08:00
plugin/            [pipeline] support fp32 for HybridPlugin / merge shardformer test and pipeline test into one file (#4354)   2023-08-15 23:25:14 +08:00
__init__.py        [booster] implemented the torch ddp + resnet example (#3232)                                                2023-03-27 10:24:14 +08:00
accelerator.py     [booster] added the accelerator implementation (#3159)                                                      2023-03-20 13:59:24 +08:00
booster.py         [plugin] add 3d parallel plugin (#4295)                                                                     2023-08-15 23:25:14 +08:00
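
Taken together, booster.py provides the Booster facade, plugin/ the parallelism plugins, and mixed_precision/ the AMP wrappers. Below is a hedged sketch of the simplest combination, echoing the torch DDP + resnet example referenced above but with a toy model in its place; the fp16 choice, tensor shapes, and config={} argument are assumptions, not repo code.

    import torch
    import torch.nn as nn

    import colossalai
    from colossalai.booster import Booster
    from colossalai.booster.plugin import TorchDDPPlugin

    colossalai.launch_from_torch(config={})

    # TorchDDPPlugin does not control precision itself, so the Booster-level
    # mixed_precision argument (backed by the mixed_precision/ module) applies.
    booster = Booster(plugin=TorchDDPPlugin(), mixed_precision="fp16")

    model = nn.Linear(32, 32)  # stand-in for the resnet of the original example
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    # boost() wraps the model in DDP and places it on the local GPU.
    model, optimizer, *_ = booster.boost(model, optimizer)

    x = torch.randn(8, 32, device="cuda")
    loss = model(x).sum()
    booster.backward(loss, optimizer)  # AMP-aware backward (loss scaling handled here)
    optimizer.step()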