ColossalAI/colossalai/booster/plugin
Baizhou Zhang 0ceec8f9a9 [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354) 2023-08-15 23:25:14 +08:00
* add naive optimizer for 3DPlugin/refactor gpt2 shardformer test
* merge tests of PP/DP/TP combinations into one test file
* fix bug when syncing gradients for DP in HybridPlugin
* update supported precisions for 3DPlugin/fix bug when shifting tp_degree
* improve the passing of lazy_init
* modify lazy_init/use sync_shared_params
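The headline change above adds fp32 as a supported precision for the hybrid (3D) parallel plugin. A minimal sketch of how the HybridParallelPlugin from hybrid_parallel_plugin.py might be configured with fp32 precision; the argument names (tp_size, pp_size, num_microbatches, precision) are taken from ColossalAI's public plugin documentation and may differ slightly in this exact revision:

```python
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

# 3D parallelism: tensor-parallel degree 2, pipeline-parallel degree 2;
# the remaining ranks are used for data parallelism.
plugin = HybridParallelPlugin(
    tp_size=2,
    pp_size=2,
    num_microbatches=4,  # needed for pipeline scheduling when pp_size > 1
    precision="fp32",    # fp32 is among the precisions supported per this commit
)
booster = Booster(plugin=plugin)
```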
__init__.py [plugin] add 3d parallel plugin (#4295) 2023-08-15 23:25:14 +08:00
dp_plugin_base.py [booster] update prepare dataloader method for plugin (#3706) 2023-05-08 15:44:03 +08:00
gemini_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
hybrid_parallel_plugin.py [pipeline] support fp32 for HybridPlugin/merge shardformer test and pipeline test into one file (#4354) 2023-08-15 23:25:14 +08:00
low_level_zero_plugin.py [zero] support shard optimizer state dict of zero (#4194) 2023-07-31 22:13:29 +08:00
plugin_base.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
pp_plugin_base.py [plugin] add 3d parallel plugin (#4295) 2023-08-15 23:25:14 +08:00
torch_ddp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
torch_fsdp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
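All of these modules implement plugins for ColossalAI's Booster API. A minimal usage sketch, here with TorchDDPPlugin from torch_ddp_plugin.py (the other plugins in this directory are swapped in the same way, each with its own constructor arguments); the launch and boost calls follow the public API of this period and may vary between ColossalAI versions:

```python
import torch
import torch.nn as nn
from torch.optim import Adam

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

# assumes the distributed env vars (RANK, WORLD_SIZE, ...) are set, e.g. via torchrun
colossalai.launch_from_torch(config={})

model = nn.Linear(16, 4)
optimizer = Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

booster = Booster(plugin=TorchDDPPlugin())
# boost() wraps model/optimizer/criterion according to the chosen plugin
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

# one training step; backward goes through the booster so the plugin can hook it
inputs, labels = torch.randn(8, 16).cuda(), torch.randint(0, 4, (8,)).cuda()
loss = criterion(model(inputs), labels)
booster.backward(loss, optimizer)
optimizer.step()
optimizer.zero_grad()
```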