ColossalAI/colossalai/booster/plugin

Latest commit: 725af3eeeb by Wenhao Chen, 2023-06-15 17:38:42 +08:00
[booster] make optimizer argument optional for boost (#3993)
* feat: make optimizer optional in Booster.boost
* test: skip unet test if diffusers version > 0.10.2
File                     Last commit                                               Date
__init__.py              [booster] support torch fsdp plugin in booster (#3697)    2023-05-15 12:14:38 +08:00
dp_plugin_base.py        [booster] update prepare dataloader method for plugin (#3706)    2023-05-08 15:44:03 +08:00
gemini_plugin.py         [booster] make optimizer argument optional for boost (#3993)    2023-06-15 17:38:42 +08:00
low_level_zero_plugin.py [booster] make optimizer argument optional for boost (#3993)    2023-06-15 17:38:42 +08:00
plugin_base.py           [booster] make optimizer argument optional for boost (#3993)    2023-06-15 17:38:42 +08:00
torch_ddp_plugin.py      [booster] make optimizer argument optional for boost (#3993)    2023-06-15 17:38:42 +08:00
torch_fsdp_plugin.py     [booster] make optimizer argument optional for boost (#3993)    2023-06-15 17:38:42 +08:00