mirror of https://github.com/InternLM/InternLM
update readme.md
parent 7efd96502a
commit a3946235b2
@@ -133,6 +133,12 @@ streamlit run web_demo.py

You can now interact directly with models in the InternLM format using `web_demo_internlm.py`.
First, download the model weights in InternLM format and replace `ckpt_dir` in `web_demo_internlm.py`, then run the following command to interact:
```bash
torchrun --master_port 12331 --nnodes=1 --node_rank=0 --nproc_per_node=1 -m streamlit run web_demo_internlm.py
```

### Deployment
Use [LMDeploy](https://github.com/InternLM/LMDeploy) to deploy InternLM with one click.
@@ -226,6 +226,12 @@ streamlit run web_demo.py

You can now interact directly with models in the InternLM format using `web_demo_internlm.py`.
First, download the model weights in InternLM format and replace `ckpt_dir` in `web_demo_internlm.py`, then run the following command to interact:
```bash
torchrun --master_port 12331 --nnodes=1 --node_rank=0 --nproc_per_node=1 -m streamlit run web_demo_internlm.py
```

### High-Performance Deployment of InternLM
We use [LMDeploy](https://github.com/InternLM/LMDeploy) to complete the one-click deployment of InternLM.
@@ -222,6 +222,13 @@ The effect is as follows

You can now interact directly with models in the InternLM format using `web_demo_internlm.py`.
First, download the model weights in InternLM format and replace `ckpt_dir` in `web_demo_internlm.py`, then run the following command to interact:
```bash
torchrun --master_port 12331 --nnodes=1 --node_rank=0 --nproc_per_node=1 -m streamlit run web_demo_internlm.py
```
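For reference, pointing the demo at your weights is a one-line change; a minimal sketch, assuming the weights were downloaded to a local directory (the path below is a hypothetical placeholder, not a real location):

```python
# In web_demo_internlm.py: replace the existing ckpt_dir value with the
# directory that holds your InternLM-format weights. The path below is a
# hypothetical placeholder.
ckpt_dir = "/path/to/internlm-chat-7b-internlm-format"
```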
### Deployment
We use [LMDeploy](https://github.com/InternLM/LMDeploy) to complete the one-click deployment of InternLM.
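As a taste of what the deployed workflow looks like, here is a minimal sketch using LMDeploy's high-level `pipeline` API; this assumes a recent LMDeploy release and a placeholder model name, so check the LMDeploy docs for the exact interface:

```python
# Minimal LMDeploy sketch (assumes `pip install lmdeploy`); the model name
# below is a placeholder for whichever InternLM checkpoint you deploy.
from lmdeploy import pipeline

pipe = pipeline("internlm/internlm-chat-7b")
responses = pipe(["Introduce InternLM in one sentence."])
print(responses[0].text)
```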
@@ -53,9 +53,9 @@ system_meta_instruction = (
- InternLM (书生·浦语) can understand and communicate fluently in the language chosen by the user such as English and 中文.
"""
)
-user_prompt = "<|User|>:{user}<eoh>\n"
+user_prompt = "<|User|>:{user}\n"
robot_prompt = "<|Bot|>:{robot}<eoa>\n"
-cur_query_prompt = "<|User|>:{user}<eoh>\n<|Bot|>:"
+cur_query_prompt = "<|User|>:{user}\n<|Bot|>:"
def combine_history(prompt):
@@ -92,9 +92,9 @@ system_meta_instruction = (
- InternLM (书生·浦语) can understand and communicate fluently in the language chosen by the user such as English and 中文.
"""
)
-user_prompt = "<|User|>:{user}<eoh>\n"
+user_prompt = "<|User|>:{user}\n"
robot_prompt = "<|Bot|>:{robot}<eoa>\n"
-cur_query_prompt = "<|User|>:{user}<eoh>\n<|Bot|>:"
+cur_query_prompt = "<|User|>:{user}\n<|Bot|>:"
def combine_history(prompt):
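The net effect of both hunks is that user turns no longer end with `<eoh>`, while bot turns still end with `<eoa>`. Here is a minimal sketch of how these templates compose a chat prompt; the body of `combine_history` is not shown in this diff, so `combine_history_sketch` below is an illustrative reconstruction, not the actual implementation:

```python
# Updated templates from the diff above.
user_prompt = "<|User|>:{user}\n"
robot_prompt = "<|Bot|>:{robot}<eoa>\n"
cur_query_prompt = "<|User|>:{user}\n<|Bot|>:"

def combine_history_sketch(turns, current_query):
    """turns: list of (user, bot) message pairs already exchanged."""
    prompt = ""
    for user, bot in turns:
        prompt += user_prompt.format(user=user)
        prompt += robot_prompt.format(robot=bot)
    # The current query ends with "<|Bot|>:" so the model continues as the bot.
    return prompt + cur_query_prompt.format(user=current_query)
```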