ColossalAI/colossalai/zero/gemini
Latest commit: [llama] fix training and inference scripts (#5384) by Hongxin Liu (7303801854), 2024-02-19 16:41:04 +08:00
* [llama] refactor inference example to fit sft
* [llama] fix training script to fit gemini
* [llama] fix inference script
Name | Last commit | Date
chunk | [feat] refactored extension module (#5298) | 2024-01-25 17:01:48 +08:00
memory_tracer | [npu] change device to accelerator api (#5239) | 2024-01-09 10:20:05 +08:00
__init__.py | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 2023-11-28 16:54:42 +08:00
gemini_ddp.py | [llama] fix training and inference scripts (#5384) | 2024-02-19 16:41:04 +08:00
gemini_hook.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00
gemini_mgr.py | [npu] add npu support for gemini and zero (#5067) | 2023-11-20 16:12:41 +08:00
gemini_optimizer.py | [checkpointio] fix gemini and hybrid parallel optim checkpoint (#5347) | 2024-02-01 16:13:06 +08:00
placement_policy.py | [npu] change device to accelerator api (#5239) | 2024-01-09 10:20:05 +08:00
utils.py | [npu] change device to accelerator api (#5239) | 2024-01-09 10:20:05 +08:00