ColossalAI/colossalai/checkpoint_io

Latest commit: e86127925a by Hongxin Liu, "[plugin] support all-gather overlap for hybrid parallel" (#5919), 4 months ago
File | Last commit | Age
__init__.py | [MoE/ZeRO] Moe refactor with zero refactor (#5821) | 5 months ago
checkpoint_io_base.py | [lora] add lora APIs for booster, support lora for TorchDDP (#4981) | 7 months ago
general_checkpoint_io.py | [lora] add lora APIs for booster, support lora for TorchDDP (#4981) | 7 months ago
hybrid_parallel_checkpoint_io.py | [plugin] support all-gather overlap for hybrid parallel (#5919) | 4 months ago
index_file.py | [misc] update pre-commit and run all files (#4752) | 1 year ago
moe_checkpoint.py | [MoE/ZeRO] Moe refactor with zero refactor (#5821) | 5 months ago
utils.py | [MoE/ZeRO] Moe refactor with zero refactor (#5821) | 5 months ago