ColossalAI/colossalai/gemini
Latest commit: b87496a66b by Jiarui Fang, "[hotfix] fix auto policy of test_sharded_optim_v2 (#2157)", 2022-12-20 23:03:18 +08:00
| Name | Last commit | Last commit date |
| --- | --- | --- |
| `chunk/` | [Gemini] chunk init using runtime visited param order (#2115) | 2022-12-12 18:06:16 +08:00 |
| `memory_tracer/` | [hotfix] fix auto policy of test_sharded_optim_v2 (#2157) | 2022-12-20 23:03:18 +08:00 |
| `ophooks/` | [Gemini] update the non model data record method in runtime memory tracer (#2128) | 2022-12-13 17:11:31 +08:00 |
| `paramhooks/` | [hotfix] remove potiential circle import (#1307) | 2022-07-14 13:44:26 +08:00 |
| `__init__.py` | [hotfix] polish chunk import (#1787) | 2022-11-02 12:10:52 +08:00 |
| `gemini_context.py` | [hotfix] add deconstructor for stateful tensor (#848) | 2022-04-24 15:03:04 +08:00 |
| `gemini_mgr.py` | [Gemini] update API of the chunkmemstatscollector. (#2129) | 2022-12-14 00:47:06 +08:00 |
| `placement_policy.py` | [Gemini] polish memstats collector (#1962) | 2022-11-16 15:45:57 +08:00 |
| `stateful_tensor.py` | [hotfix] add deconstructor for stateful tensor (#848) | 2022-04-24 15:03:04 +08:00 |
| `stateful_tensor_mgr.py` | [gemini] accelerate adjust_layout() (#878) | 2022-04-26 18:08:31 +08:00 |
| `tensor_placement_policy.py` | [gemini] accelerate adjust_layout() (#878) | 2022-04-26 18:08:31 +08:00 |
| `tensor_utils.py` | [Gemini] free and allocate cuda memory by tensor.storage, add grad hook (#2040) | 2022-11-30 15:57:45 +08:00 |
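The modules listed above make up Gemini, ColossalAI's heterogeneous (CPU + GPU) memory manager: chunked parameter storage (`chunk/`), a runtime memory tracer (`memory_tracer/`), and device placement policies (`placement_policy.py`). As a rough orientation only, the sketch below shows how these pieces are typically driven from user code via the `GeminiDDP` wrapper in ColossalAI releases of this period; it is a minimal sketch assuming the ~v0.1.x-era API (`launch_from_torch`, `GeminiDDP`, `get_current_device`), and exact signatures may differ between releases.

```python
# Hedged usage sketch, not taken from this directory's docs: illustrates how
# Gemini's chunk manager, memory tracer, and placement policy are engaged
# through the GeminiDDP wrapper (ColossalAI ~v0.1.x API; signatures may vary).
import colossalai
import torch.nn as nn
from colossalai.nn.parallel import GeminiDDP
from colossalai.utils import get_current_device

# Requires a torchrun-launched process group.
colossalai.launch_from_torch(config={})

model = nn.Sequential(nn.Linear(1024, 1024), nn.GELU(), nn.Linear(1024, 1024))

# placement_policy corresponds to the policies in placement_policy.py:
# "cpu"/"cuda" keep parameter chunks resident on one device, while "auto"
# lets the runtime memory tracer (memory_tracer/) decide chunk placement.
model = GeminiDDP(
    model,
    device=get_current_device(),
    placement_policy="auto",
    pin_memory=True,
)
```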