ColossalAI/colossalai/gemini
Latest commit: 1a229045af by YH, "Add interface for colo tesnor dp size (#3227)", 2 years ago
chunk/                      Add interface for colo tesnor dp size (#3227)                           2 years ago
memory_tracer/              [example] update gpt example for larger model scale (#2211)             2 years ago
ophooks/                    [doc] add deepspeed citation and copyright (#2996)                      2 years ago
paramhooks/                 [hotfix] remove potiential circle import (#1307)                        2 years ago
__init__.py                 [hotfix] polish chunk import (#1787)                                    2 years ago
gemini_context.py           [NFC] polish colossalai/gemini/gemini_context.py code style (#2690)    2 years ago
gemini_mgr.py               [hotfix] fix zero ddp warmup check (#2545)                              2 years ago
placement_policy.py         [Gemini] fix the convert_to_torch_module bug (#2269)                    2 years ago
stateful_tensor.py
stateful_tensor_mgr.py
tensor_placement_policy.py
tensor_utils.py             [Gemini] free and allocate cuda memory by tensor.storage, add grad hook (#2040)    2 years ago