ColossalAI/colossalai/gemini

Latest commit: 20e255d4e8 by Zihao — MemStatsCollectorStatic (#1765), 2 years ago
Name                        | Last commit                                                              | Age
--------------------------- | ------------------------------------------------------------------------ | -----------
chunk/                      | [hotfix] fix zero's incompatibility with checkpoint in torch-1.12 (#1786) | 2 years ago
memory_tracer/              | MemStatsCollectorStatic (#1765)                                          | 2 years ago
ophooks/                    | [hotfix] remove potiential circle import (#1307)                         | 2 years ago
paramhooks/                 | [hotfix] remove potiential circle import (#1307)                         | 2 years ago
__init__.py                 | [hotfix] polish chunk import (#1787)                                     | 2 years ago
gemini_context.py           | [hotfix] add deconstructor for stateful tensor (#848)                    | 3 years ago
gemini_mgr.py               | MemStatsCollectorStatic (#1765)                                          | 2 years ago
placement_policy.py         | [zero] add constant placement policy (#1705)                             | 2 years ago
stateful_tensor.py          | [hotfix] add deconstructor for stateful tensor (#848)                    | 3 years ago
stateful_tensor_mgr.py      | [gemini] accelerate adjust_layout() (#878)                               | 3 years ago
tensor_placement_policy.py  | [gemini] accelerate adjust_layout() (#878)                               | 3 years ago
tensor_utils.py             | [gemini] add GeminiMemoryManger (#832)                                   | 3 years ago