Making large AI models cheaper, faster and more accessible
Latest commit: Wang Binluo, 75c963686f, "[lora] lora support hybrid parallel plugin (#5956)", 4 months ago
| File | Last commit message | Last commit date |
| --- | --- | --- |
| __init__.py | [shardformer] fix the moe (#5883) | 5 months ago |
| dp_plugin_base.py | | |
| gemini_plugin.py | | |
| hybrid_parallel_plugin.py | [lora] lora support hybrid parallel plugin (#5956) | 4 months ago |
| low_level_zero_plugin.py | [chore] solve moe ckpt test failure and some other arg pass failure | 4 months ago |
| moe_hybrid_parallel_plugin.py | [moe] solve dp axis issue | 4 months ago |
| plugin_base.py | | |
| pp_plugin_base.py | | |
| torch_ddp_plugin.py | | |
| torch_fsdp_plugin.py | | |
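
Each file in this directory implements one parallelization or memory-management strategy behind a common plugin interface, which is consumed through ColossalAI's Booster API. Below is a minimal sketch of that usage pattern; the `HybridParallelPlugin` constructor arguments shown (`tp_size`, `pp_size`, `precision`) and the `launch_from_torch()` call are illustrative assumptions based on common usage, so check the plugin sources above for the authoritative signatures.

```python
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

# Set up the distributed environment (assumes a torchrun-style launch;
# older ColossalAI releases required a `config` argument here).
colossalai.launch_from_torch()

# Choose a parallelization strategy by choosing a plugin. The argument
# values below are placeholders for illustration only.
plugin = HybridParallelPlugin(tp_size=2, pp_size=1, precision="bf16")
booster = Booster(plugin=plugin)

model = nn.Linear(64, 8)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

# boost() wraps the model, optimizer, and criterion for the chosen
# strategy and returns their distributed counterparts.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)
```

Because every plugin exposes the same Booster-facing interface, swapping in another strategy from this directory (e.g. `GeminiPlugin`, `LowLevelZeroPlugin`, `TorchDDPPlugin`, or `TorchFSDPPlugin`) should only require changing the `plugin = ...` line, not the training loop.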