ColossalAI/colossalai/booster/plugin

Latest commit: 1a49a5ea00 by LuGY, "[zero] support shard optimizer state dict of zero (#4194)", 1 year ago
__init__.py              [booster] support torch fsdp plugin in booster (#3697)            2 years ago
dp_plugin_base.py        [booster] update prepare dataloader method for plugin (#3706)     2 years ago
gemini_plugin.py         [zero] support no_sync method for zero1 plugin (#4138)            1 year ago
low_level_zero_plugin.py [zero] support shard optimizer state dict of zero (#4194)         1 year ago
plugin_base.py           [zero] support no_sync method for zero1 plugin (#4138)            1 year ago
torch_ddp_plugin.py      [zero] support no_sync method for zero1 plugin (#4138)            1 year ago
torch_fsdp_plugin.py     [zero] support no_sync method for zero1 plugin (#4138)            1 year ago