diff --git a/README-zh-Hans.md b/README-zh-Hans.md
index f2c4edb75..f9ae0a269 100644
--- a/README-zh-Hans.md
+++ b/README-zh-Hans.md
@@ -35,6 +35,7 @@
GPT-2
BERT
PaLM
+ OPT
@@ -130,7 +131,13 @@ Colossal-AI provides you with a collection of parallel components. Our goal is to make your
### PaLM
- [PaLM-colossalai](https://github.com/hpcaitech/PaLM-colossalai): Scalable implementation of Google's Pathways Language Model ([PaLM](https://ai.googleblog.com/2022/04/pathways-language-model-palm-scaling-to.html)).
-Please visit our [documentation and tutorials](https://www.colossalai.org/) for more details.
+### OPT
+
+
+- [Open Pretrained Transformer (OPT)](https://github.com/facebookresearch/metaseq), a 175-billion-parameter language model released by Meta. Because its pretrained weights are fully public, it facilitates the development of downstream tasks and application deployments.
+- Fine-tune OPT at low cost with a 40% speedup in just a few lines of code. [[Example]](https://github.com/hpcaitech/ColossalAI-Examples/tree/main/language/opt)
+
+Please visit our [documentation](https://www.colossalai.org/) and [examples](https://github.com/hpcaitech/ColossalAI-Examples) for more details.
(back to top)
diff --git a/README.md b/README.md
index 4ddfb136a..3fc964b06 100644
--- a/README.md
+++ b/README.md
@@ -35,6 +35,7 @@
GPT-2
BERT
PaLM
+ OPT
@@ -135,7 +136,13 @@ distributed training and inference in a few lines.
### PaLM
- [PaLM-colossalai](https://github.com/hpcaitech/PaLM-colossalai): Scalable implementation of Google's Pathways Language Model ([PaLM](https://ai.googleblog.com/2022/04/pathways-language-model-palm-scaling-to.html)).
-Please visit our [documentation and tutorials](https://www.colossalai.org/) for more details.
+### OPT
+
+
+- [Open Pretrained Transformer (OPT)](https://github.com/facebookresearch/metaseq), a 175-billion-parameter AI language model released by Meta, which encourages AI programmers to build various downstream tasks and application deployments because the pretrained model weights are publicly available.
+- Fine-tune OPT at low cost with a 40% speedup in just a few lines of code, as sketched below. [[Example]](https://github.com/hpcaitech/ColossalAI-Examples/tree/main/language/opt)
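+
+Below is a minimal sketch of what such a fine-tuning step could look like. The HuggingFace `OPTForCausalLM` and `AutoTokenizer` classes are real; the `colossalai.launch_from_torch` / `colossalai.initialize` calls and the `./config.py` path are assumptions based on the typical Colossal-AI workflow, so treat the linked example as the canonical script.
+
+```python
+# Hedged sketch: fine-tune a small OPT checkpoint with Colossal-AI.
+# Launch with torchrun so the distributed environment variables are set.
+import colossalai
+import torch
+from transformers import AutoTokenizer, OPTForCausalLM
+
+# Set up the distributed environment; parallelism and ZeRO settings live
+# in the (hypothetical) config file rather than in the training loop.
+colossalai.launch_from_torch(config='./config.py')
+
+# Load a small OPT checkpoint and its tokenizer from the HuggingFace Hub.
+tokenizer = AutoTokenizer.from_pretrained('facebook/opt-125m')
+model = OPTForCausalLM.from_pretrained('facebook/opt-125m')
+optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
+
+# Wrap the model and optimizer with the Colossal-AI engine.
+engine, _, _, _ = colossalai.initialize(model=model, optimizer=optimizer)
+
+# One fine-tuning step on a toy batch; the causal-LM loss comes from
+# passing the inputs as labels, as in any HuggingFace training loop.
+batch = tokenizer('Colossal-AI makes large models affordable.', return_tensors='pt')
+engine.train()
+outputs = engine(input_ids=batch['input_ids'], labels=batch['input_ids'])
+engine.backward(outputs.loss)
+engine.step()
+```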
+
+Please visit our [documentation](https://www.colossalai.org/) and [examples](https://github.com/hpcaitech/ColossalAI-Examples) for more details.
(back to top)