ColossalAI/colossalai/booster/plugin
Latest commit 1a49a5ea00 by LuGY (2023-07-31 22:13:29 +08:00): [zero] support shard optimizer state dict of zero (#4194)

* support shard optimizer of zero
* polish code
* support sync grad manually
__init__.py [booster] support torch fsdp plugin in booster (#3697) 2023-05-15 12:14:38 +08:00
dp_plugin_base.py [booster] update prepare dataloader method for plugin (#3706) 2023-05-08 15:44:03 +08:00
gemini_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
low_level_zero_plugin.py [zero] support shard optimizer state dict of zero (#4194) 2023-07-31 22:13:29 +08:00
plugin_base.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
torch_ddp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
torch_fsdp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 2023-07-31 22:13:29 +08:00
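The listing reflects a plugin architecture: `plugin_base.py` defines a common interface, and the concrete files (`torch_ddp_plugin.py`, `torch_fsdp_plugin.py`, `gemini_plugin.py`, `low_level_zero_plugin.py`) implement it, with commit #4138 adding a `no_sync` method for skipping gradient synchronization. A minimal, hypothetical sketch of that pattern follows; the class and method names here are assumptions for illustration, not ColossalAI's actual API:

```python
from abc import ABC, abstractmethod
from contextlib import contextmanager


class PluginBase(ABC):
    """Hypothetical base class mirroring the role of plugin_base.py."""

    @abstractmethod
    def configure(self, model):
        """Wrap the model for this plugin's parallel strategy."""

    @abstractmethod
    def no_sync(self, model):
        """Context manager that disables gradient synchronization."""


class LowLevelZeroLikePlugin(PluginBase):
    """Toy stand-in for a ZeRO-1-style plugin (names assumed)."""

    def configure(self, model):
        # A real plugin would shard optimizer state here; the sketch
        # returns the model unchanged.
        return model

    @contextmanager
    def no_sync(self, model):
        # Inside this block gradients accumulate locally; the caller
        # syncs them manually afterwards.
        yield


plugin = LowLevelZeroLikePlugin()
model = plugin.configure("model")
with plugin.no_sync(model):
    pass  # forward/backward without gradient all-reduce
```

This separation lets a single booster entry point swap parallel strategies (DDP, FSDP, Gemini, low-level ZeRO) without changing training-loop code.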