ColossalAI/colossalai/booster/plugin

Latest commit: [checkpointio] support async model save (#6131) by Hongxin Liu (d4a436051d), 1 week ago
File                            Last commit                                                            Age
__init__.py                     [shardformer] fix the moe (#5883)                                      5 months ago
dp_plugin_base.py               [llama] fix dataloader for hybrid parallel (#5358)                     10 months ago
gemini_plugin.py                [shardformer] optimize seq parallelism (#6086)                         2 months ago
hybrid_parallel_plugin.py       [plugin] support get_grad_norm (#6115)                                 3 weeks ago
low_level_zero_plugin.py        [checkpointio] support async model save (#6131)                        1 week ago
moe_hybrid_parallel_plugin.py   [shardformer] optimize seq parallelism (#6086)                         2 months ago
plugin_base.py                  [lora] add lora APIs for booster, support lora for TorchDDP (#4981)    7 months ago
pp_plugin_base.py               [misc] update pre-commit and run all files (#4752)                     1 year ago
torch_ddp_plugin.py             [checkpointio] support async model save (#6131)                        1 week ago
torch_fsdp_plugin.py            [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)      3 months ago
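Each `*_plugin.py` file above implements one parallel-training backend that is selected through the `Booster` API. Below is a minimal usage sketch, not taken from this directory: the toy model, optimizer, and learning rate are placeholders, and exact launch and checkpoint signatures may vary across ColossalAI versions.

```python
# Minimal sketch of using a booster plugin; run under torchrun.
# The model/optimizer here are placeholders, not part of the repo.
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin  # or GeminiPlugin, LowLevelZeroPlugin, ...

colossalai.launch_from_torch()  # initialize the distributed environment

model = nn.Linear(16, 2)  # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

plugin = TorchDDPPlugin()         # each *_plugin.py above defines one such plugin class
booster = Booster(plugin=plugin)

# boost() wraps the training objects so they run under the chosen plugin
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

# Checkpointing also goes through the booster; per #6131 above, recent
# versions additionally support saving the model asynchronously.
booster.save_model(model, "model.pt")
```

Swapping `TorchDDPPlugin` for, say, `GeminiPlugin` or `HybridParallelPlugin` changes the parallelism strategy without altering the surrounding training loop, which is the design point of the plugin abstraction in this directory.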