update readme.md

pull/478/head
YWMditto 2023-11-07 20:42:07 +08:00
parent 7efd96502a
commit a3946235b2
5 changed files with 32 additions and 13 deletions

@@ -133,6 +133,12 @@ streamlit run web_demo.py
![demo](https://github.com/InternLM/InternLM/assets/9102141/11b60ee0-47e4-42c0-8278-3051b2f17fe4)
You can now interact directly with models in InternLM format using `web_demo_internlm.py`.
First, download the model weights in InternLM format and replace `ckpt_dir` in `web_demo_internlm.py`. Then run the following command to interact:
```bash
torchrun --master_port 12331 --nnodes=1 --node_rank=0 --nproc_per_node=1 -m streamlit run web_demo_internlm.py
```
### Deployment
Deploy InternLM with one click using [LMDeploy](https://github.com/InternLM/LMDeploy).

@@ -226,6 +226,12 @@ streamlit run web_demo.py
![demo](https://github.com/InternLM/InternLM/assets/9102141/11b60ee0-47e4-42c0-8278-3051b2f17fe4)
Now you can interact directly with models in InternLM format using `web_demo_internlm.py`.
First, download the model weights in InternLM format, then replace `ckpt_dir` in `web_demo_internlm.py`. Run the following command to interact:
```bash
torchrun --master_port 12331 --nnodes=1 --node_rank=0 --nproc_per_node=1 -m streamlit run web_demo_internlm.py
```
### High-Performance Deployment
We use [LMDeploy](https://github.com/InternLM/LMDeploy) for one-click deployment of InternLM.

@@ -222,6 +222,13 @@ The effect is as follows
![demo](https://github.com/InternLM/InternLM/assets/9102141/11b60ee0-47e4-42c0-8278-3051b2f17fe4)
Now you can interact directly with models in InternLM format using `web_demo_internlm.py`.
First, please download the model weights in InternLM format and replace `ckpt_dir` in `web_demo_internlm.py`. Then run the following command to interact:
```bash
torchrun --master_port 12331 --nnodes=1 --node_rank=0 --nproc_per_node=1 -m streamlit run web_demo_internlm.py
```
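The `ckpt_dir` edit amounts to a path swap; a hypothetical one-line excerpt (the surrounding code in `web_demo_internlm.py` is assumed here and may differ):
```python
# Hypothetical excerpt from web_demo_internlm.py: point the demo at the
# locally downloaded InternLM-format weights. The path is a placeholder.
ckpt_dir = "/path/to/internlm-chat-7b-internlm-format/"
```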
### Deployment
We use [LMDeploy](https://github.com/InternLM/LMDeploy) to complete the one-click deployment of InternLM.
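For a sense of what the LMDeploy route looks like in code, here is a minimal sketch using LMDeploy's high-level `pipeline` API (available in recent releases; the model ID below is illustrative, substitute your own InternLM checkpoint):
```python
# Minimal LMDeploy sketch: load InternLM and run a single query.
# Assumes `pip install lmdeploy`; the model ID is an illustrative assumption.
from lmdeploy import pipeline

pipe = pipeline("internlm/internlm-chat-7b")
responses = pipe(["Briefly introduce yourself."])
print(responses[0].text)
```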

@@ -53,9 +53,9 @@ system_meta_instruction = (
- InternLM (书生·浦语) can understand and communicate fluently in the language chosen by the user such as English and 中文.
"""
)
-user_prompt = "<|User|>:{user}<eoh>\n"
+user_prompt = "<|User|>:{user}\n"
robot_prompt = "<|Bot|>:{robot}<eoa>\n"
-cur_query_prompt = "<|User|>:{user}<eoh>\n<|Bot|>:"
+cur_query_prompt = "<|User|>:{user}\n<|Bot|>:"
def combine_history(prompt):

@@ -92,9 +92,9 @@ system_meta_instruction = (
- InternLM (书生·浦语) can understand and communicate fluently in the language chosen by the user such as English and 中文.
"""
)
-user_prompt = "<|User|>:{user}<eoh>\n"
+user_prompt = "<|User|>:{user}\n"
robot_prompt = "<|Bot|>:{robot}<eoa>\n"
-cur_query_prompt = "<|User|>:{user}<eoh>\n<|Bot|>:"
+cur_query_prompt = "<|User|>:{user}\n<|Bot|>:"
def combine_history(prompt):
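These templates are what `combine_history` stitches together; the change in both files drops the `<eoh>` end-of-human token from user turns. A standalone sketch of the assembly logic (the real `combine_history` takes a single `prompt` argument and reads prior turns from Streamlit session state; here the history is passed in explicitly, and the system meta instruction is omitted):
```python
# Standalone sketch of the prompt assembly these templates feed into.
user_prompt = "<|User|>:{user}\n"
robot_prompt = "<|Bot|>:{robot}<eoa>\n"
cur_query_prompt = "<|User|>:{user}\n<|Bot|>:"

def combine_history(history, query):
    # Render past (user, bot) turns, then append the current query so the
    # model continues from the open "<|Bot|>:" tag.
    total_prompt = ""
    for user_turn, bot_turn in history:
        total_prompt += user_prompt.format(user=user_turn)
        total_prompt += robot_prompt.format(robot=bot_turn)
    total_prompt += cur_query_prompt.format(user=query)
    return total_prompt

# One prior exchange plus a new question:
print(combine_history([("Hello", "Hi! How can I help you?")], "What is InternLM?"))
```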