ColossalAI/colossalai/gemini
Latest commit b3b89865e2 by Jiarui Fang: [Gemini] ParamOpHook -> ColoParamOpHook (#2080), 2022-12-05 17:11:06 +08:00

Name                         Last commit message                                                               Date
chunk/                       [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)         2022-11-02 16:11:34 +08:00
memory_tracer/               [Gemini] ParamOpHook -> ColoParamOpHook (#2080)                                   2022-12-05 17:11:06 +08:00
ophooks/                     [Gemini] ParamOpHook -> ColoParamOpHook (#2080)                                   2022-12-05 17:11:06 +08:00
paramhooks/
__init__.py
gemini_context.py
gemini_mgr.py                [Gemini] polish memstats collector (#1962)                                        2022-11-16 15:45:57 +08:00
placement_policy.py          [Gemini] polish memstats collector (#1962)                                        2022-11-16 15:45:57 +08:00
stateful_tensor.py
stateful_tensor_mgr.py
tensor_placement_policy.py
tensor_utils.py              [Gemini] free and allocate cuda memory by tensor.storage, add grad hook (#2040)   2022-11-30 15:57:45 +08:00