ColossalAI/colossalai/booster/plugin

Latest commit: 641b1ee71a by Hongxin Liu, "[devops] remove post commit ci" (#5566), 8 months ago
File                           Last commit                                                          Last changed
__init__.py                    [misc] update pre-commit and run all files (#4752)                   1 year ago
dp_plugin_base.py              [llama] fix dataloader for hybrid parallel (#5358)                   10 months ago
gemini_plugin.py               [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335)   9 months ago
hybrid_parallel_plugin.py      [shardformer] Sequence Parallelism Optimization (#5533)              8 months ago
low_level_zero_plugin.py       [npu] change device to accelerator api (#5239)                       11 months ago
moe_hybrid_parallel_plugin.py  [shardformer] Sequence Parallelism Optimization (#5533)              8 months ago
plugin_base.py                 [misc] update pre-commit and run all files (#4752)                   1 year ago
pp_plugin_base.py              [misc] update pre-commit and run all files (#4752)                   1 year ago
torch_ddp_plugin.py            [doc] polish shardformer doc (#4779)                                 1 year ago
torch_fsdp_plugin.py           [devops] remove post commit ci (#5566)                               8 months ago
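
This directory contains the plugins consumed by ColossalAI's Booster API: each file above defines one parallelism or memory-optimization strategy (Gemini, hybrid parallel, low-level ZeRO, Torch DDP, Torch FSDP) behind a common plugin interface. Below is a minimal sketch of how one of these plugins is selected and passed to the Booster, assuming the public `colossalai.booster` API; the model, optimizer, and criterion are placeholders, and the exact `launch_from_torch` signature may vary across ColossalAI versions.

import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin  # or GeminiPlugin, HybridParallelPlugin, ...

# Set up the distributed environment (some versions require a config argument).
colossalai.launch_from_torch()

# Pick a plugin from this directory; TorchDDPPlugin lives in torch_ddp_plugin.py.
plugin = TorchDDPPlugin()
booster = Booster(plugin=plugin)

# Placeholder training objects for illustration only.
model = torch.nn.Linear(16, 16)
optimizer = torch.optim.Adam(model.parameters())
criterion = torch.nn.MSELoss()

# boost() wraps the objects according to the chosen plugin's strategy
# and returns (model, optimizer, criterion, dataloader, lr_scheduler).
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

Swapping the plugin (for example, `GeminiPlugin()` from gemini_plugin.py) changes the distributed training strategy without altering the rest of the training loop, which is the point of the shared base classes here (plugin_base.py, dp_plugin_base.py, pp_plugin_base.py).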