ColossalAI/colossalai/booster/plugin
Latest commit: e86127925a by Hongxin Liu — [plugin] support all-gather overlap for hybrid parallel (#5919), 4 months ago
__init__.py
dp_plugin_base.py
gemini_plugin.py
hybrid_parallel_plugin.py
low_level_zero_plugin.py
moe_hybrid_parallel_plugin.py
plugin_base.py
pp_plugin_base.py
torch_ddp_plugin.py
torch_fsdp_plugin.py