ColossalAI/colossalai/booster/plugin
Latest commit by flybird11111 (3e02154710)
[gemini] gemini support extra-dp (#5043)
* support ddp
* fix
* simplify tests
2023-11-16 21:03:04 +08:00
File | Last commit message | Last commit date
__init__.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
dp_plugin_base.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
gemini_plugin.py | [gemini] gemini support extra-dp (#5043) | 2023-11-16 21:03:04 +08:00
hybrid_parallel_plugin.py | [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) | 2023-11-03 13:32:43 +08:00
low_level_zero_plugin.py | [hotfix] fix the bug of repeatedly storing param group (#4951) | 2023-10-31 14:48:01 +08:00
moe_hybrid_parallel_plugin.py | [moe] support optimizer checkpoint (#5015) | 2023-11-08 15:07:03 +00:00
plugin_base.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
pp_plugin_base.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
torch_ddp_plugin.py | [doc] polish shardformer doc (#4779) | 2023-09-26 10:57:47 +08:00
torch_fsdp_plugin.py | [doc] polish shardformer doc (#4779) | 2023-09-26 10:57:47 +08:00