ColossalAI/colossalai/booster/plugin

Latest commit: cf519dac6a by Hongxin Liu, 2024-11-20 16:36:37 +08:00
[optim] hotfix adam load (#6146)

* [optim] hotfix adam load
* [checkpointio] fix optimizer async io
* [pre-commit.ci] auto fixes from pre-commit.com hooks
  (for more information, see https://pre-commit.ci)
* [checkpointio] update test
* [checkpointio] update test

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
File                             Last commit message                                             Last commit date
__init__.py                      [shardformer] fix the moe (#5883)                               2024-07-03 20:02:19 +08:00
dp_plugin_base.py                [llama] fix dataloader for hybrid parallel (#5358)              2024-02-05 15:14:56 +08:00
gemini_plugin.py                 [async io]supoort async io (#6137)                              2024-11-19 14:51:39 +08:00
hybrid_parallel_plugin.py        [Zerobubble] merge main. (#6142)                                2024-11-19 19:00:36 +08:00
low_level_zero_plugin.py         [optim] hotfix adam load (#6146)                                2024-11-20 16:36:37 +08:00
moe_hybrid_parallel_plugin.py    [Zerobubble] merge main. (#6142)                                2024-11-19 19:00:36 +08:00
plugin_base.py                   [lora] add lora APIs for booster, support lora for TorchDDP (#4981)  2024-04-28 10:51:27 +08:00
pp_plugin_base.py                [misc] update pre-commit and run all files (#4752)              2023-09-19 14:20:26 +08:00
torch_ddp_plugin.py              [async io]supoort async io (#6137)                              2024-11-19 14:51:39 +08:00
torch_fsdp_plugin.py             [async io]supoort async io (#6137)                              2024-11-19 14:51:39 +08:00