ColossalAI/colossalai/zero/gemini
Latest commit: 58d8b8a2dd by Hongxin Liu (2024-10-18 16:48:52 +08:00)
[misc] fit torch api upgradation and remove legecy import (#6093)

* [amp] fit torch's new api
* [amp] fix api call
* [misc] fit torch pytree api upgrade
* [misc] remove legacy import
* [misc] fit torch amp api
chunk/                [fp8] Disable all_gather intranode. Disable Redundant all_gather fp8 (#6059)                 2024-09-14 10:40:01 +08:00
memory_tracer/        [misc] fit torch api upgradation and remove legecy import (#6093)                            2024-10-18 16:48:52 +08:00
__init__.py           [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)  2023-11-28 16:54:42 +08:00
gemini_ddp.py         [fp8] support gemini plugin (#5978)                                                          2024-08-09 14:09:48 +08:00
gemini_hook.py        [gemini] quick fix on possible async operation (#5803)                                       2024-06-13 10:35:17 +08:00
gemini_mgr.py         [chore] remove unnecessary assert since compute list might not be recorded                  2024-05-28 05:16:02 +00:00
gemini_optimizer.py   [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)                            2024-08-22 09:21:34 +08:00
placement_policy.py   [misc] fit torch api upgradation and remove legecy import (#6093)                            2024-10-18 16:48:52 +08:00
utils.py              [npu] change device to accelerator api (#5239)                                               2024-01-09 10:20:05 +08:00