ColossalAI/colossalai/booster/plugin
Latest commit: 023ea13cb5 by botbw — Merge pull request #5749 from hpcaitech/prefetch
"[Gemini] Prefetch next chunk before each op" (2024-05-29 15:35:54 +08:00)
__init__.py
dp_plugin_base.py
gemini_plugin.py — Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch (2024-05-24 04:05:07 +00:00)
hybrid_parallel_plugin.py — [Feature] auto-cast optimizers to distributed version (#5746) (2024-05-24 17:24:16 +08:00)
low_level_zero_plugin.py — [Feature] auto-cast optimizers to distributed version (#5746) (2024-05-24 17:24:16 +08:00)
moe_hybrid_parallel_plugin.py — [shardformer] Sequence Parallelism Optimization (#5533) (2024-04-03 17:15:47 +08:00)
plugin_base.py — [lora] add lora APIs for booster, support lora for TorchDDP (#4981) (2024-04-28 10:51:27 +08:00)
pp_plugin_base.py
torch_ddp_plugin.py — [misc] refactor launch API and tensor constructor (#5666) (2024-04-29 10:40:11 +08:00)
torch_fsdp_plugin.py — [lora] add lora APIs for booster, support lora for TorchDDP (#4981) (2024-04-28 10:51:27 +08:00)