ColossalAI/colossalai/booster
Latest commit 44eab2b27f by Baizhou Zhang (2023-08-25 22:04:57 +08:00)
[shardformer] support sharded checkpoint IO for models of HybridParallelPlugin (#4506)

* add APIs
* implement save_sharded_model
* add test for hybrid checkpoint IO
* implement naive loading for sharded model
* implement efficient sharded model loading
* open a new file for hybrid checkpoint_io
* small fix
* fix circular importing
* fix docstring
* arrange arguments and APIs
* small fix
mixed_precision   [NFC] Fix format for mixed precision (#4253)                                             2023-07-26 14:12:57 +08:00
plugin            [shardformer] support sharded checkpoint IO for models of HybridParallelPlugin (#4506)   2023-08-25 22:04:57 +08:00
__init__.py       [booster] implemented the torch ddp + resnet example (#3232)                              2023-03-27 10:24:14 +08:00
accelerator.py    [booster] added the accelerator implementation (#3159)                                    2023-03-20 13:59:24 +08:00
booster.py        [misc] resolve code factor issues (#4433)                                                 2023-08-15 23:25:14 +08:00