ColossalAI/colossalai/booster/plugin
Latest commit: 7f8203af69 by digger yu — fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) — 2023-05-24 09:01:50 +08:00
| File | Latest commit | Date |
| --- | --- | --- |
| __init__.py | [booster] support torch fsdp plugin in booster (#3697) | 2023-05-15 12:14:38 +08:00 |
| dp_plugin_base.py | [booster] update prepare dataloader method for plugin (#3706) | 2023-05-08 15:44:03 +08:00 |
| gemini_plugin.py | fix typo colossalai/auto_parallel autochunk fx/passes etc. (#3808) | 2023-05-24 09:01:50 +08:00 |
| low_level_zero_plugin.py | [plugin] a workaround for zero plugins' optimizer checkpoint (#3780) | 2023-05-19 19:42:31 +08:00 |
| plugin_base.py | [booster] fix no_sync method (#3709) | 2023-05-09 11:10:02 +08:00 |
| torch_ddp_plugin.py | [plugin] torch ddp plugin supports sharded model checkpoint (#3775) | 2023-05-18 20:05:59 +08:00 |
| torch_fsdp_plugin.py | [booster] torch fsdp fix ckpt (#3788) | 2023-05-23 16:58:45 +08:00 |