ColossalAI/colossalai/booster/plugin
Last commit: 6c0fa7b9a8 by Hongxin Liu, 2024-02-05 15:14:56 +08:00
[llama] fix dataloader for hybrid parallel (#5358)
* [plugin] refactor prepare dataloader
* [plugin] update train script
File                            Last commit                                          Date
__init__.py                     [misc] update pre-commit and run all files (#4752)   2023-09-19 14:20:26 +08:00
dp_plugin_base.py               [llama] fix dataloader for hybrid parallel (#5358)   2024-02-05 15:14:56 +08:00
gemini_plugin.py                [llama] fix dataloader for hybrid parallel (#5358)   2024-02-05 15:14:56 +08:00
hybrid_parallel_plugin.py       [llama] fix dataloader for hybrid parallel (#5358)   2024-02-05 15:14:56 +08:00
low_level_zero_plugin.py        [npu] change device to accelerator api (#5239)       2024-01-09 10:20:05 +08:00
moe_hybrid_parallel_plugin.py   [moe] support optimizer checkpoint (#5015)           2023-11-08 15:07:03 +00:00
plugin_base.py                  [misc] update pre-commit and run all files (#4752)   2023-09-19 14:20:26 +08:00
pp_plugin_base.py               [misc] update pre-commit and run all files (#4752)   2023-09-19 14:20:26 +08:00
torch_ddp_plugin.py             [doc] polish shardformer doc (#4779)                 2023-09-26 10:57:47 +08:00
torch_fsdp_plugin.py            [doc] polish shardformer doc (#4779)                 2023-09-26 10:57:47 +08:00