ColossalAI/colossalai/booster/plugin
Baizhou Zhang 44eab2b27f
[shardformer] support sharded checkpoint IO for models of HybridParallelPlugin (#4506)
* add APIs
* implement save_sharded_model
* add test for hybrid checkpointio
* implement naive loading for sharded model
* implement efficient sharded model loading
* open a new file for hybrid checkpoint_io
* small fix
* fix circular importing
* fix docstring
* arrange arguments and apis
* small fix
2023-08-25 22:04:57 +08:00
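For context, the feature this commit adds is exercised through the Booster checkpoint API: a model boosted with HybridParallelPlugin can be saved and reloaded as a sharded checkpoint via booster.save_model(..., shard=True) and booster.load_model(...). The sketch below is illustrative only and is not code from this PR; it assumes a torchrun launch, and the toy model, parallel sizes (tp_size, pp_size), precision, and checkpoint path are placeholder assumptions.

```python
# Minimal usage sketch (not from this PR): saving and loading a sharded model
# checkpoint through the Booster API with HybridParallelPlugin.
# Assumes a distributed launch via `torchrun`; model, sizes, and paths are placeholders.
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

colossalai.launch_from_torch(config={})

# Data parallelism only here (tp_size=1, pp_size=1), so the toy model needs no
# Shardformer policy; raise tp_size/pp_size for transformer models that have one.
plugin = HybridParallelPlugin(tp_size=1, pp_size=1, precision="fp32")
booster = Booster(plugin=plugin)

model = nn.Sequential(nn.Linear(32, 32), nn.ReLU(), nn.Linear(32, 8))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# boost() wraps the model/optimizer for the parallel strategy chosen by the plugin.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

# Save as a sharded checkpoint: weights are split into files no larger than
# size_per_shard (MB), plus an index file mapping parameter names to shard files.
booster.save_model(model, "ckpt_dir", shard=True, size_per_shard=1024)

# Load the sharded checkpoint back in the same parallel setup
# (point at the checkpoint directory / its index file).
booster.load_model(model, "ckpt_dir")
```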
File                       | Latest commit                                                                           | Date
__init__.py                | [plugin] add 3d parallel plugin (#4295)                                                 | 2023-08-15 23:25:14 +08:00
dp_plugin_base.py          | [booster] update prepare dataloader method for plugin (#3706)                           | 2023-05-08 15:44:03 +08:00
gemini_plugin.py           | [zero]support no_sync method for zero1 plugin (#4138)                                   | 2023-07-31 22:13:29 +08:00
hybrid_parallel_plugin.py  | [shardformer] support sharded checkpoint IO for models of HybridParallelPlugin (#4506)  | 2023-08-25 22:04:57 +08:00
low_level_zero_plugin.py   | [zero] support shard optimizer state dict of zero (#4194)                               | 2023-07-31 22:13:29 +08:00
plugin_base.py             | [zero]support no_sync method for zero1 plugin (#4138)                                   | 2023-07-31 22:13:29 +08:00
pp_plugin_base.py          | [plugin] add 3d parallel plugin (#4295)                                                 | 2023-08-15 23:25:14 +08:00
torch_ddp_plugin.py        | [zero]support no_sync method for zero1 plugin (#4138)                                   | 2023-07-31 22:13:29 +08:00
torch_fsdp_plugin.py       | [zero]support no_sync method for zero1 plugin (#4138)                                   | 2023-07-31 22:13:29 +08:00