mirror of https://github.com/hpcaitech/ColossalAI
Latest commit:

* implement sharded optimizer saving
* add more param info
* finish implementation of sharded optimizer saving
* fix bugs in optimizer sharded saving
* add pp+zero test
* param group loading
* greedy loading of optimizer
* fix bug when loading
* implement optimizer sharded saving
* add optimizer test & arrange checkpointIO utils
* fix gemini sharding state_dict
* add verbose option
* add loading of master params
* fix typehint
* fix master/working mapping in fp16 amp
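The commit log above centers on sharded optimizer checkpointing: splitting a large optimizer state into size-capped shard files plus an index, then loading shards greedily one at a time. Below is a minimal sketch of that idea using a plain `torch.optim` optimizer; the function name `save_optimizer_sharded`, the shard file naming, and the index layout are all hypothetical illustrations, not ColossalAI's actual CheckpointIO interface.

```python
# Sketch only: illustrates sharded optimizer saving, not ColossalAI's real API.
import os
import torch

def save_optimizer_sharded(optimizer, save_dir, max_shard_size=1 << 30):
    """Split optimizer.state_dict()["state"] into shards of at most
    `max_shard_size` bytes each, and write a param-id -> filename index."""
    os.makedirs(save_dir, exist_ok=True)
    full = optimizer.state_dict()
    index = {"param_groups": full["param_groups"], "weight_map": {}}
    shard, shard_size, shard_idx = {}, 0, 0
    for param_id, state in full["state"].items():
        # Estimate the byte size of this parameter's state (exp_avg, exp_avg_sq, ...).
        size = sum(v.numel() * v.element_size() for v in state.values() if torch.is_tensor(v))
        # Greedy packing: flush the current shard once adding this entry would exceed the cap.
        if shard and shard_size + size > max_shard_size:
            torch.save(shard, os.path.join(save_dir, f"optim-{shard_idx:05d}.bin"))
            shard, shard_size, shard_idx = {}, 0, shard_idx + 1
        shard[param_id] = state
        shard_size += size
        index["weight_map"][param_id] = f"optim-{shard_idx:05d}.bin"
    if shard:
        torch.save(shard, os.path.join(save_dir, f"optim-{shard_idx:05d}.bin"))
    # The index records param groups and which shard file holds each param's state.
    torch.save(index, os.path.join(save_dir, "optim.index.bin"))
```

Loading can then proceed greedily, as the commit messages suggest: walk the index, load one shard file at a time, and merge its entries into the state dict before restoring, so at most one shard is resident in memory at once.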
Directory contents:

* chunk/
* memory_tracer/
* __init__.py
* colo_init_context.py
* gemini_ddp.py
* gemini_hook.py
* gemini_mgr.py
* gemini_optimizer.py
* placement_policy.py
* utils.py