ColossalAI/colossalai/gemini

Latest commit 6a9158f1fa by Zihao (2 years ago):
[Gemini] free and allocate cuda memory by tensor.storage, add grad hook (#2040)
(a sketch of the storage trick named in this commit message follows the listing below)
Name                          Last commit message                                                                 Last commit date
chunk/                        [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786)          2 years ago
memory_tracer/                [Gemini] free and allocate cuda memory by tensor.storage, add grad hook (#2040)    2 years ago
ophooks/                      [Gemini] free and allocate cuda memory by tensor.storage, add grad hook (#2040)    2 years ago
paramhooks/
__init__.py
gemini_context.py
gemini_mgr.py                 [Gemini] polish memstats collector (#1962)                                         2 years ago
placement_policy.py           [Gemini] polish memstats collector (#1962)                                         2 years ago
stateful_tensor.py
stateful_tensor_mgr.py
tensor_placement_policy.py
tensor_utils.py               [Gemini] free and allocate cuda memory by tensor.storage, add grad hook (#2040)    2 years ago
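The latest commit message describes the mechanism in one line: CUDA memory is freed and re-allocated through a tensor's storage, and gradient hooks decide when that happens. Below is a minimal, self-contained PyTorch sketch of that trick. It is not Gemini's actual code: the helpers free_storage and alloc_storage are illustrative names, and a real system (as gemini_mgr.py and the chunk/ sub-package would) must also restore the tensor's values from its own copy after re-allocating, since resizing a storage does not preserve data.

```python
import torch

def free_storage(t: torch.Tensor) -> None:
    # Shrinking the underlying storage to zero elements releases the
    # CUDA (or CPU) memory while keeping the tensor object and its
    # shape metadata alive, so existing references stay valid.
    if t.storage().size() > 0:
        t.storage().resize_(0)

def alloc_storage(t: torch.Tensor) -> None:
    # Re-allocate the backing memory before the tensor is used again.
    # NOTE: the values are NOT restored; a system like Gemini copies
    # them back from its own (e.g. chunked or offloaded) copy.
    if t.storage().size() == 0:
        t.storage().resize_(t.numel())

device = 'cuda' if torch.cuda.is_available() else 'cpu'
p = torch.nn.Parameter(torch.randn(1024, 1024, device=device))

def on_grad(grad: torch.Tensor) -> torch.Tensor:
    # The hook fires once p's gradient has been produced, which is the
    # earliest safe point to drop p's payload during backward.
    free_storage(p.data)
    return grad

p.register_hook(on_grad)

loss = (p * p).sum()
loss.backward()
assert p.data.storage().size() == 0  # payload released by the hook

alloc_storage(p.data)  # bring the memory back before the next use
```

Resizing the storage in place, rather than replacing the tensor, keeps every existing reference (module attributes, optimizer state) pointing at the same object, which is presumably why this pattern is used here and in similar sharding/offloading systems.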