ColossalAI/colossalai/booster/plugin
Latest commit: Baizhou Zhang, 38ccb8b1a3 — [shardformer] support from_pretrained when loading model with HybridParallelPlugin (#4575), 1 year ago
__init__.py [plugin] add 3d parallel plugin (#4295) 1 year ago
dp_plugin_base.py
gemini_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 1 year ago
hybrid_parallel_plugin.py [shardformer] support from_pretrained when loading model with HybridParallelPlugin (#4575) 1 year ago
low_level_zero_plugin.py [zero] support shard optimizer state dict of zero (#4194) 1 year ago
plugin_base.py [zero]support no_sync method for zero1 plugin (#4138) 1 year ago
pp_plugin_base.py [plugin] add 3d parallel plugin (#4295) 1 year ago
torch_ddp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 1 year ago
torch_fsdp_plugin.py [zero]support no_sync method for zero1 plugin (#4138) 1 year ago