ColossalAI/colossalai/gemini

Latest commit: 1a229045af by YH, "Add interface for colo tesnor dp size (#3227)" (2023-03-27 09:42:21 +08:00)
chunk/                       Add interface for colo tesnor dp size (#3227)                                    2023-03-27 09:42:21 +08:00
memory_tracer/               [example] update gpt example for larger model scale (#2211)                      2022-12-28 13:54:08 +08:00
ophooks/                     [doc] add deepspeed citation and copyright (#2996)                               2023-03-04 20:08:11 +08:00
paramhooks/                  [hotfix] remove potiential circle import (#1307)                                 2022-07-14 13:44:26 +08:00
__init__.py                  [hotfix] polish chunk import (#1787)                                             2022-11-02 12:10:52 +08:00
gemini_context.py            [NFC] polish colossalai/gemini/gemini_context.py code style (#2690)              2023-02-14 11:55:23 +08:00
gemini_mgr.py                [hotfix] fix zero ddp warmup check (#2545)                                       2023-02-02 16:42:38 +08:00
placement_policy.py          [Gemini] fix the convert_to_torch_module bug (#2269)                             2023-01-03 15:55:35 +08:00
stateful_tensor.py           [hotfix] add deconstructor for stateful tensor (#848)                            2022-04-24 15:03:04 +08:00
stateful_tensor_mgr.py       [gemini] accelerate adjust_layout() (#878)                                       2022-04-26 18:08:31 +08:00
tensor_placement_policy.py   [gemini] accelerate adjust_layout() (#878)                                       2022-04-26 18:08:31 +08:00
tensor_utils.py              [Gemini] free and allocate cuda memory by tensor.storage, add grad hook (#2040)  2022-11-30 15:57:45 +08:00