From 682af6139671030c5b4da69246197627f189c404 Mon Sep 17 00:00:00 2001
From: binmakeswell
Date: Wed, 29 Mar 2023 02:35:10 +0800
Subject: [PATCH] [doc] add ColossalChat (#3297)

* [doc] add ColossalChat
---
 README.md              | 21 +++++++++++++++------
 docs/README-zh-Hans.md | 20 ++++++++++++++------
 2 files changed, 29 insertions(+), 12 deletions(-)

diff --git a/README.md b/README.md
index 3098d72b4..77c3471d9 100644
--- a/README.md
+++ b/README.md
@@ -66,7 +66,7 @@
   • Colossal-AI for Real World Applications
@@ -214,22 +214,31 @@ Please visit our [documentation](https://www.colossalai.org/) and [examples](htt

    (back to top)

 ## Colossal-AI in the Real World
-### ChatGPT
-A low-cost [ChatGPT](https://openai.com/blog/chatgpt/) equivalent implementation process. [[code]](https://github.com/hpcaitech/ColossalAI/tree/main/applications/ChatGPT) [[blog]](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt)
-

+
+### ColossalChat
+
+

+
+
+
+
+
+[ColossalChat](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat): An open-source solution for cloning [ChatGPT](https://openai.com/blog/chatgpt/) with a complete RLHF pipeline. [[code]](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat) [[blog]](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt) [[demo]](https://chat.colossalai.org)
+
+

 - Up to 7.73 times faster for single server training and 1.42 times faster for single-GPU inference
-

    +

 - Up to 10.3x growth in model capacity on one GPU
 - A mini demo training process requires only 1.62GB of GPU memory (any consumer-grade GPU)
-

    +

diff --git a/docs/README-zh-Hans.md b/docs/README-zh-Hans.md
index 81c45abfd..4be923eca 100644
--- a/docs/README-zh-Hans.md
+++ b/docs/README-zh-Hans.md
@@ -66,7 +66,7 @@
   • Colossal-AI Success Stories
@@ -212,22 +212,30 @@ Colossal-AI provides you with a series of parallel components. Our goal is to make your

    (back to top)

 ## Colossal-AI Success Stories
-### ChatGPT
-A low-cost replication of the complete [ChatGPT](https://openai.com/blog/chatgpt/) pipeline [[code]](https://github.com/hpcaitech/ColossalAI/tree/main/applications/ChatGPT) [[blog]](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt)
-

+### ColossalChat
+
+

+
+
+
+
+
+[ColossalChat](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat): A zero-barrier clone of [ChatGPT](https://openai.com/blog/chatgpt/) with a complete RLHF pipeline [[code]](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat) [[blog]](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt) [[demo]](https://chat.colossalai.org)
+
+

 - Up to 7.73x faster training on a single server and 1.42x faster single-GPU inference
-

    +

 - Up to 10.3x growth in model capacity on a single GPU
 - A minimal demo training run requires as little as 1.62GB of GPU memory (any consumer-grade GPU)
-

    +
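The patch above advertises ColossalChat's "complete RLHF pipeline". As background for reviewers, an RLHF recipe of this kind conventionally runs three stages in order: supervised fine-tuning on demonstrations, reward-model training on human preference comparisons, and PPO fine-tuning against the learned reward. The sketch below is purely schematic — every name in it is illustrative and is not ColossalChat's actual API (the real entry points live under `applications/Chat` in the repository):

```python
# Schematic outline of a three-stage RLHF pipeline. All identifiers here
# are illustrative stand-ins, not ColossalChat's real classes or functions.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Policy:
    """Toy stand-in for a language model's trainable state."""
    stages_completed: List[str] = field(default_factory=list)


def supervised_fine_tune(policy: Policy, demos: List[str]) -> Policy:
    # Stage 1: imitate human demonstrations (prompt -> ideal answer pairs).
    policy.stages_completed.append(f"sft({len(demos)} demos)")
    return policy


def train_reward_model(comparisons: List[Tuple[str, str]]) -> str:
    # Stage 2: fit a scalar reward from human preference comparisons
    # (chosen vs. rejected answer for the same prompt).
    return f"reward_model({len(comparisons)} comparisons)"


def ppo_fine_tune(policy: Policy, reward_model: str, steps: int) -> Policy:
    # Stage 3: optimize the policy against the learned reward with PPO;
    # real pipelines also add a KL penalty toward the SFT policy.
    policy.stages_completed.append(f"ppo({reward_model}, {steps} steps)")
    return policy


policy = supervised_fine_tune(Policy(), demos=["q1->a1", "q2->a2"])
rm = train_reward_model([("good answer", "bad answer")])
policy = ppo_fine_tune(policy, rm, steps=100)
print(policy.stages_completed)
# → ['sft(2 demos)', 'ppo(reward_model(1 comparisons), 100 steps)']
```

The point of the skeleton is only the stage ordering: the reward model is produced between the two policy-training stages and is consumed, not trained, during PPO.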