ColossalAI/colossalai/booster/plugin

Latest commit: ff507b755e by hxwang, "Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch" (6 months ago)
| File | Last commit | Last change |
|---|---|---|
| __init__.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| dp_plugin_base.py | [llama] fix dataloader for hybrid parallel (#5358) | 10 months ago |
| gemini_plugin.py | Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch | 6 months ago |
| hybrid_parallel_plugin.py | [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) | 7 months ago |
| low_level_zero_plugin.py | [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) | 7 months ago |
| moe_hybrid_parallel_plugin.py | [shardformer] Sequence Parallelism Optimization (#5533) | 8 months ago |
| plugin_base.py | [lora] add lora APIs for booster, support lora for TorchDDP (#4981) | 7 months ago |
| pp_plugin_base.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| torch_ddp_plugin.py | [misc] refactor launch API and tensor constructor (#5666) | 7 months ago |
| torch_fsdp_plugin.py | [lora] add lora APIs for booster, support lora for TorchDDP (#4981) | 7 months ago |
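For orientation, the files above implement the plugins consumed by ColossalAI's Booster API. Below is a minimal sketch of that usage pattern, assuming a `torchrun` launch on a CUDA machine and a recent ColossalAI version; the toy model, loss, and hyperparameters are illustrative only, and any plugin from this directory could be substituted for `TorchDDPPlugin`.

```python
# Minimal sketch: training with a booster plugin.
# Assumes launch via `torchrun` and an available CUDA device.
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

colossalai.launch_from_torch()  # initialize the distributed environment

# Illustrative toy model and optimizer (not from the repo).
model = nn.Linear(64, 8)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

plugin = TorchDDPPlugin()            # swap in GeminiPlugin, HybridParallelPlugin, etc.
booster = Booster(plugin=plugin)

# boost() wraps the objects according to the chosen plugin and
# returns (model, optimizer, criterion, dataloader, lr_scheduler).
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

x = torch.randn(16, 64, device="cuda")
y = torch.randn(16, 8, device="cuda")
loss = criterion(model(x), y)
booster.backward(loss, optimizer)    # plugin-aware backward pass
optimizer.step()
optimizer.zero_grad()
```

The design point this directory reflects: each parallelism strategy (DDP, FSDP, Gemini/ZeRO, hybrid pipeline+tensor parallel, MoE) lives behind the same `plugin_base.py` interface, so training scripts switch strategies by changing only the plugin constructor.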