ColossalAI/colossalai/checkpoint_io
Latest commit: 75c963686f by Wang Binluo, "[lora] lora support hybrid parallel plugin (#5956)", 2024-08-02 10:36:58 +08:00
__init__.py                        [MoE/ZeRO] Moe refactor with zero refactor (#5821)               2024-06-28 14:00:08 +08:00
checkpoint_io_base.py              [lora] add lora APIs for booster, support lora for TorchDDP (#4981)  2024-04-28 10:51:27 +08:00
general_checkpoint_io.py           [lora] add lora APIs for booster, support lora for TorchDDP (#4981)  2024-04-28 10:51:27 +08:00
hybrid_parallel_checkpoint_io.py   [lora] lora support hybrid parallel plugin (#5956)               2024-08-02 10:36:58 +08:00
index_file.py                      [misc] update pre-commit and run all files (#4752)               2023-09-19 14:20:26 +08:00
moe_checkpoint.py                  [moe] implement tp                                               2024-08-01 10:06:59 +08:00
utils.py                           [MoE/ZeRO] Moe refactor with zero refactor (#5821)               2024-06-28 14:00:08 +08:00
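Among these files, index_file.py handles the index that accompanies a sharded checkpoint: a small mapping from each weight name to the shard file storing it. The sketch below illustrates that general idea only; `build_index` and `shard_for` are hypothetical names for this example, not ColossalAI's actual API.

```python
import json

def build_index(weight_to_shard):
    # Illustrative sketch (not ColossalAI's real index format): map each
    # weight name to the shard file that stores it.
    return {"weight_map": dict(weight_to_shard)}

def shard_for(index, weight_name):
    # Look up which shard file holds a given weight.
    return index["weight_map"][weight_name]

index = build_index({
    "embed.weight": "model-00001.bin",
    "layer.0.attn.weight": "model-00001.bin",
    "layer.1.attn.weight": "model-00002.bin",
})

# The index is typically serialized as JSON alongside the shard files,
# so a loader can open only the shards it needs.
serialized = json.dumps(index)
print(shard_for(index, "layer.1.attn.weight"))  # model-00002.bin
```

When loading, a reader consults this index first and then opens only the shard files required for the requested weights, which is what makes sharded checkpoints practical for models too large for a single file.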