ColossalAI/colossalai/booster/plugin

Latest commit: Hongxin Liu, 1b387ca9fe, "[shardformer] refactor pipeline grad ckpt config (#5646)", 7 months ago
File                             Last commit                                            Age
__init__.py                      [misc] update pre-commit and run all files (#4752)     1 year ago
dp_plugin_base.py                [llama] fix dataloader for hybrid parallel (#5358)     10 months ago
gemini_plugin.py                 [exampe] update llama example (#5626)                  7 months ago
hybrid_parallel_plugin.py        [shardformer] refactor pipeline grad ckpt config (#5646)  7 months ago
low_level_zero_plugin.py         [npu] change device to accelerator api (#5239)         11 months ago
moe_hybrid_parallel_plugin.py    [shardformer] Sequence Parallelism Optimization (#5533)  8 months ago
plugin_base.py                   [misc] update pre-commit and run all files (#4752)     1 year ago
pp_plugin_base.py                [misc] update pre-commit and run all files (#4752)     1 year ago
torch_ddp_plugin.py              [doc] polish shardformer doc (#4779)                   1 year ago
torch_fsdp_plugin.py             [devops] remove post commit ci (#5566)                 8 months ago