ColossalAI/colossalai/gemini

Latest commit: 63fbba3c19 by HELSON, "[zero] add L2 gradient clipping for ZeRO (#2112)", 2 years ago
chunk/                       [zero] add L2 gradient clipping for ZeRO (#2112)                                   2 years ago
memory_tracer/               [gemini] get the param visited order during runtime (#2108)                        2 years ago
ophooks/                     [gemini] get the param visited order during runtime (#2108)                        2 years ago
paramhooks/
__init__.py
gemini_context.py
gemini_mgr.py                [Gemini] gemini use the runtime memory tracer (RMT) (#2099)                        2 years ago
placement_policy.py          [Gemini] polish memstats collector (#1962)                                          2 years ago
stateful_tensor.py
stateful_tensor_mgr.py
tensor_placement_policy.py
tensor_utils.py              [Gemini] free and allocate cuda memory by tensor.storage, add grad hook (#2040)    2 years ago
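
The modules above make up Gemini's heterogeneous memory manager: chunk/ groups parameters into chunks, placement_policy.py decides where chunks live, and gemini_mgr.py ties the two together. Below is a minimal usage sketch, not an authoritative example: the import paths mirror this listing, but the helper and constructor signatures (init_chunk_manager, GeminiManager) have changed across ColossalAI releases, so treat the exact argument names and values here as assumptions.

    import torch
    import torch.nn as nn

    from colossalai.gemini.chunk import init_chunk_manager   # chunk/ subpackage
    from colossalai.gemini.gemini_mgr import GeminiManager   # gemini_mgr.py

    # Toy model; Gemini targets models far larger than this in practice.
    model = nn.Linear(1024, 1024).cuda()

    # Group the model's parameters into chunks. The helper is assumed to live
    # in chunk/; keyword names may differ between versions.
    chunk_manager = init_chunk_manager(model=model,
                                       init_device=torch.device('cuda'),
                                       search_range_mb=32)

    # GeminiManager pairs a placement policy (placement_policy.py) with the
    # chunk manager to decide whether each chunk sits on CPU or CUDA memory
    # during training. 'cpu' and 'auto' are other commonly documented policies.
    gemini_manager = GeminiManager('cuda', chunk_manager)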