ColossalAI/colossalai/booster/plugin
Latest commit: bd1ab98158 by Frank Lee, [gemini] fixed the gemini checkpoint io (#3934), 2023-06-09 09:48:49 +08:00
File                      Last commit message                                                  Date
__init__.py               [booster] support torch fsdp plugin in booster (#3697)               2023-05-15 12:14:38 +08:00
dp_plugin_base.py         [booster] update prepare dataloader method for plugin (#3706)        2023-05-08 15:44:03 +08:00
gemini_plugin.py          [gemini] fixed the gemini checkpoint io (#3934)                      2023-06-09 09:48:49 +08:00
low_level_zero_plugin.py  [bf16] add bf16 support (#3882)                                      2023-06-05 15:58:31 +08:00
plugin_base.py            [booster] fix no_sync method (#3709)                                 2023-05-09 11:10:02 +08:00
torch_ddp_plugin.py       [plugin] torch ddp plugin supports sharded model checkpoint (#3775)  2023-05-18 20:05:59 +08:00
torch_fsdp_plugin.py      [booster] add warning for torch fsdp plugin doc (#3833)              2023-05-25 14:00:02 +08:00
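The file names suggest a common layout: plugin_base.py defines an abstract plugin interface, and the concrete plugins (torch_ddp_plugin.py, torch_fsdp_plugin.py, gemini_plugin.py, low_level_zero_plugin.py) each implement it for a different parallelism strategy. A minimal sketch of that pattern follows; all class and method names here are illustrative assumptions, not ColossalAI's actual API.

```python
from abc import ABC, abstractmethod

class PluginBase(ABC):
    """Hypothetical base class: each plugin wraps a model for one
    training strategy (DDP-, FSDP-, or Gemini-style)."""

    @abstractmethod
    def configure(self, model):
        """Return the model wrapped for this strategy."""

class EchoPlugin(PluginBase):
    """Toy plugin that only tags the model, standing in for a real
    strategy-specific wrapper."""

    def configure(self, model):
        return ("wrapped", model)

# A booster-style driver would accept any PluginBase and delegate to it.
plugin = EchoPlugin()
print(plugin.configure("net"))  # → ('wrapped', 'net')
```

In this layout, adding a new strategy means adding one file with one subclass, which matches how the directory grew (e.g. the FSDP plugin added in #3697).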