ColossalAI/colossalai/booster/plugin

Latest commit bdb125f83f by Guangyao Zhang: [doc] FP8 training and communication document (#6050) (2 months ago)
__init__.py                    [shardformer] fix the moe (#5883)                                   5 months ago
dp_plugin_base.py
gemini_plugin.py               [doc] FP8 training and communication document (#6050)               2 months ago
hybrid_parallel_plugin.py      [doc] FP8 training and communication document (#6050)               2 months ago
low_level_zero_plugin.py       [doc] FP8 training and communication document (#6050)               2 months ago
moe_hybrid_parallel_plugin.py  [doc] FP8 training and communication document (#6050)               2 months ago
plugin_base.py
pp_plugin_base.py
torch_ddp_plugin.py            [doc] FP8 training and communication document (#6050)               2 months ago
torch_fsdp_plugin.py           [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)   3 months ago
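
For context, the modules in this directory implement the plugins consumed by ColossalAI's Booster API: a plugin encapsulates a parallelism/acceleration strategy, and Booster applies it to a model, optimizer, and criterion. Below is a minimal sketch, assuming a recent ColossalAI release launched via torchrun (exact launch arguments vary across versions), showing how a plugin such as TorchDDPPlugin from torch_ddp_plugin.py is plugged in:

import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

# Initialize the distributed environment from torchrun-set env vars.
# (Older ColossalAI versions require launch_from_torch(config={}).)
colossalai.launch_from_torch()

# Toy model/optimizer/criterion purely for illustration.
model = torch.nn.Linear(8, 8)
optimizer = torch.optim.Adam(model.parameters())
criterion = torch.nn.MSELoss()

# Pick a plugin defined in this directory and hand it to Booster;
# boost() returns wrapped versions ready for distributed training.
plugin = TorchDDPPlugin()
booster = Booster(plugin=plugin)
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

The other files here follow the same pattern: GeminiPlugin, HybridParallelPlugin, LowLevelZeroPlugin, and TorchFSDPPlugin can be swapped in for TorchDDPPlugin with their own constructor options (the FP8-related commits above added FP8 training/communication flags to several of them; consult each plugin's docstring for the exact parameter names in your version).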