From 1b57684158283df0e0abb690b34d711c5052f07f Mon Sep 17 00:00:00 2001
From: Brian J King
Date: Sat, 18 Mar 2023 20:46:37 -0700
Subject: [PATCH] grammar / spelling updates

---
 README_en.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README_en.md b/README_en.md
index ca85c21..3d6dad8 100644
--- a/README_en.md
+++ b/README_en.md
@@ -4,7 +4,7 @@
 ChatGLM-6B is an open bilingual language model based on [General Language Model (GLM)](https://github.com/THUDM/GLM) framework, with 6.2 billion parameters. With the quantization technique, users can deploy locally on consumer-grade graphics cards (only 6GB of GPU memory is required at the INT4 quantization level).
 
-ChatGLM-6B uses technology similar to ChatGPT, optimized for Chinese QA and dialogue. The model is trained for about 1T tokens of Chinese and English corpus, supplemented by supervised fine-tuning, feedback bootstrap, and reinforcement learning wit human feedback. With only about 6.2 billion parameters, the model is able to generate answers that are in line with human preference.
+ChatGLM-6B uses technology similar to ChatGPT, optimized for Chinese QA and dialogue. The model is trained for about 1T tokens of Chinese and English corpus, supplemented by supervised fine-tuning, feedback bootstrap, and reinforcement learning with human feedback. With only about 6.2 billion parameters, the model is able to generate answers that are in line with human preference.
 
 ## Hardware Requirements
 
@@ -18,7 +18,7 @@ ChatGLM-6B uses technology similar to ChatGPT, optimized for Chinese QA and dial
 
 ### Environment Setup
 
-Install the requirements with pip: `pip install -r requirements.txt`. `transformers` library version is recommended to be `4.26.1`, but theoretically any version no lower than `4.23.1` is acceptable.
+Install the requirements with pip: `pip install -r requirements.txt`. `transformers` suggested library version is `4.26.1`, but theoretically any version greater than `4.23.1` is acceptable.
 
 ### Usage