ColossalAI/colossalai/booster/plugin
Latest commit 3c07a2846e by Hongxin Liu (2023-05-19 19:42:31 +08:00):
[plugin] a workaround for zero plugins' optimizer checkpoint (#3780)
* [test] refactor torch ddp checkpoint test
* [plugin] update low level zero optim checkpoint
* [plugin] update gemini optim checkpoint
File                        Last commit                                                            Date
__init__.py                 [booster] support torch fsdp plugin in booster (#3697)                 2023-05-15 12:14:38 +08:00
dp_plugin_base.py           [booster] update prepare dataloader method for plugin (#3706)          2023-05-08 15:44:03 +08:00
gemini_plugin.py            [plugin] a workaround for zero plugins' optimizer checkpoint (#3780)   2023-05-19 19:42:31 +08:00
low_level_zero_plugin.py    [plugin] a workaround for zero plugins' optimizer checkpoint (#3780)   2023-05-19 19:42:31 +08:00
plugin_base.py              [booster] fix no_sync method (#3709)                                   2023-05-09 11:10:02 +08:00
torch_ddp_plugin.py         [plugin] torch ddp plugin supports sharded model checkpoint (#3775)    2023-05-18 20:05:59 +08:00
torch_fsdp_plugin.py        [booster] support torch fsdp plugin in booster (#3697)                 2023-05-15 12:14:38 +08:00