mirror of https://github.com/hpcaitech/ColossalAI
Squashed commit messages:

* implement sharded optimizer saving
* add more param info
* finish implementation of sharded optimizer saving
* fix bugs in optimizer sharded saving
* add pp+zero test
* param group loading
* greedy loading of optimizer
* fix bug when loading
* implement optimizer sharded saving
* add optimizer test & arrange checkpointIO utils
* fix gemini sharding state_dict
* add verbose option
* add loading of master params
* fix typehint
* fix master/working mapping in fp16 amp
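The commits above implement sharded saving of an optimizer checkpoint: per-parameter states are greedily packed into size-capped shards so no single file holds the whole state_dict. The sketch below illustrates that greedy packing idea with plain dictionaries; it is a hypothetical simplification, and the function name `shard_state_dict` and the toy byte-size convention are assumptions, not ColossalAI's actual CheckpointIO API.

```python
# Hypothetical sketch of greedy, size-capped sharding of an optimizer
# state_dict. Illustrative only; ColossalAI's real CheckpointIO differs.

def shard_state_dict(state, max_shard_size):
    """Greedily pack per-parameter states into shards, starting a new
    shard whenever adding the next entry would exceed max_shard_size."""
    shards, current, current_size = [], {}, 0
    for param_id, tensors in state.items():
        # stand-in for summing tensor byte sizes in a real checkpoint
        size = sum(tensors.values())
        if current and current_size + size > max_shard_size:
            shards.append(current)
            current, current_size = {}, 0
        current[param_id] = tensors
        current_size += size
    if current:
        shards.append(current)
    return shards

# toy "optimizer state": param id -> {state name: size in bytes}
state = {
    0: {"exp_avg": 40, "exp_avg_sq": 40},
    1: {"exp_avg": 30, "exp_avg_sq": 30},
    2: {"exp_avg": 50, "exp_avg_sq": 50},
}
shards = shard_state_dict(state, max_shard_size=100)
# each shard can then be saved to its own file and loaded lazily
```

With a 100-byte cap, each of the three toy parameters (80, 60, and 100 bytes of state) lands in its own shard; loading can then stream one shard at a time instead of materializing the full state_dict.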
Directory contents:

* gemini
* legacy
* low_level
* __init__.py
* wrapper.py