ColossalAI/colossalai/booster/plugin
Latest commit: wukong1992 3229f93e30, [booster] add warning for torch fsdp plugin doc (#3833), 2023-05-25 14:00:02 +08:00
__init__.py               [booster] support torch fsdp plugin in booster (#3697)               2023-05-15 12:14:38 +08:00
dp_plugin_base.py         [booster] update prepare dataloader method for plugin (#3706)        2023-05-08 15:44:03 +08:00
gemini_plugin.py          fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808)   2023-05-24 09:01:50 +08:00
low_level_zero_plugin.py  [plugin] a workaround for zero plugins' optimizer checkpoint (#3780) 2023-05-19 19:42:31 +08:00
plugin_base.py            [booster] fix no_sync method (#3709)                                 2023-05-09 11:10:02 +08:00
torch_ddp_plugin.py       [plugin] torch ddp plugin supports sharded model checkpoint (#3775)  2023-05-18 20:05:59 +08:00
torch_fsdp_plugin.py      [booster] add warning for torch fsdp plugin doc (#3833)              2023-05-25 14:00:02 +08:00