mirror of https://github.com/THUDM/ChatGLM-6B
grammar / spelling updates
parent 6cda36633e
commit 1b57684158
@@ -4,7 +4,7 @@
ChatGLM-6B is an open bilingual language model based on [General Language Model (GLM)](https://github.com/THUDM/GLM) framework, with 6.2 billion parameters. With the quantization technique, users can deploy locally on consumer-grade graphics cards (only 6GB of GPU memory is required at the INT4 quantization level).
-ChatGLM-6B uses technology similar to ChatGPT, optimized for Chinese QA and dialogue. The model is trained for about 1T tokens of Chinese and English corpus, supplemented by supervised fine-tuning, feedback bootstrap, and reinforcement learning wit human feedback. With only about 6.2 billion parameters, the model is able to generate answers that are in line with human preference.
+ChatGLM-6B uses technology similar to ChatGPT, optimized for Chinese QA and dialogue. The model is trained for about 1T tokens of Chinese and English corpus, supplemented by supervised fine-tuning, feedback bootstrap, and reinforcement learning with human feedback. With only about 6.2 billion parameters, the model is able to generate answers that are in line with human preference.
## Hardware Requirements
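The INT4 deployment path described in the hunk above comes down to a few lines of Python. The sketch below is not part of this commit; it assumes the published `THUDM/chatglm-6b` checkpoint on Hugging Face and the `quantize()` helper bundled with the model's remote code, so treat it as an illustration of the 6GB setup rather than the repository's exact instructions.

```python
# Sketch: load ChatGLM-6B quantized to INT4 so it fits in roughly 6 GB of GPU memory.
# Assumes the "THUDM/chatglm-6b" checkpoint and the quantize() helper shipped with the
# model's remote code (trust_remote_code=True); adjust the path for a local checkpoint.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = (
    AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
    .half()       # FP16 weights first
    .quantize(4)  # then INT4 quantization, which is what brings GPU memory down to ~6 GB
    .cuda()
)
model = model.eval()

response, history = model.chat(tokenizer, "Hello", history=[])
print(response)
```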
@@ -18,7 +18,7 @@ ChatGLM-6B uses technology similar to ChatGPT, optimized for Chinese QA and dial
### Environment Setup
-Install the requirements with pip: `pip install -r requirements.txt`. `transformers` library version is recommended to be `4.26.1`, but theoretically any version no lower than `4.23.1` is acceptable.
+Install the requirements with pip: `pip install -r requirements.txt`. `transformers` suggested library version is `4.26.1`, but theoretically any version greater than `4.23.1` is acceptable.
### Usage
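Since the Environment Setup lines above name both a recommended `transformers` release and a minimum acceptable one, a quick runtime check can catch a mismatched install early. This is a minimal sketch, not part of the commit, and it assumes the `packaging` package is available (it normally is wherever `transformers` is installed).

```python
# Sketch: verify the installed transformers release meets the floor quoted above
# (4.23.1, with 4.26.1 recommended). Relies only on importlib.metadata and packaging.
from importlib.metadata import version
from packaging.version import Version

installed = Version(version("transformers"))
if installed < Version("4.23.1"):
    raise RuntimeError(f"transformers {installed} is below the supported minimum 4.23.1")
print(f"transformers {installed} detected; 4.26.1 is the recommended release")
```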