Update web_demo.py

1. Verified in testing: with `from transformers import AutoModelForCausalLM, AutoTokenizer`, `AutoTokenizer` must be imported before `AutoModelForCausalLM`; otherwise the following error is raised:

```bash
Traceback (most recent call last):
  File "/root/.conda/envs/InternLM/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
  File "/root/code/InternLM2/web_demo.py", line 10, in <module>
    from transformers import AutoModelForCausalLM, AutoTokenizer
ImportError: cannot import name 'AutoModelForCausalLM' from 'transformers' (/root/.local/lib/python3.10/site-packages/transformers/__init__.py)
load model begin.
```
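Note that the traceback above resolves `transformers` from `/root/.local/lib/...` while streamlit runs from the conda env at `/root/.conda/envs/InternLM/...`, which hints at two conflicting installations. A quick diagnostic sketch (not part of the fix itself) to check which copy of each package would actually be imported:

```python
import importlib.util

def module_origins(names):
    """Map each module name to the file path it would be imported from,
    or None if the module cannot be found on sys.path."""
    return {n: (s.origin if (s := importlib.util.find_spec(n)) else None)
            for n in names}

if __name__ == "__main__":
    # Print where transformers and streamlit actually resolve from;
    # two different site-packages roots suggest a stale duplicate install.
    for name, origin in module_origins(["transformers", "streamlit"]).items():
        print(f"{name} -> {origin}")
```

If the paths differ (e.g. a user-site install shadowing the conda env), removing the stale copy is usually cleaner than relying on import order.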

2. `web_demo.py` must be launched with the `streamlit` command, not with `python`; otherwise the following error is raised:

```bash
2024-01-19 15:02:13.807 Session state does not function when running a script without `streamlit run`
Traceback (most recent call last):
  File "/root/.conda/envs/InternLM/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 378, in __getitem__
    return self._getitem(widget_id, key)
  File "/root/.conda/envs/InternLM/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 423, in _getitem
    raise KeyError
KeyError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/.conda/envs/InternLM/lib/python3.10/site-packages/streamlit/runtime/state/session_state_proxy.py", line 119, in __getattr__
    return self[key]
  File "/root/.conda/envs/InternLM/lib/python3.10/site-packages/streamlit/runtime/state/session_state_proxy.py", line 90, in __getitem__
    return get_session_state()[key]
  File "/root/.conda/envs/InternLM/lib/python3.10/site-packages/streamlit/runtime/state/safe_session_state.py", line 113, in __getitem__
    return self._state[key]
  File "/root/.conda/envs/InternLM/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 380, in __getitem__
    raise KeyError(_missing_key_error_message(key))
KeyError: 'st.session_state has no key "messages". Did you forget to initialize it? More info: https://docs.streamlit.io/library/advanced-features/session-state#initialization'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "//root/code/InternLM2/web_demo.py", line 251, in <module>
    main()
  File "//root/code/InternLM2/web_demo.py", line 214, in main
    for message in st.session_state.messages:
  File "/root/.conda/envs/InternLM/lib/python3.10/site-packages/streamlit/runtime/state/session_state_proxy.py", line 121, in __getattr__
    raise AttributeError(_missing_attr_error_message(key))
AttributeError: st.session_state has no attribute "messages". Did you forget to initialize it? More info: https://docs.streamlit.io/library/advanced-features/session-state#initialization
```
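The `KeyError`/`AttributeError` above is a symptom of running outside `streamlit run`, where `st.session_state` is unavailable. Inside Streamlit, the docs linked in the error message recommend an initialize-if-absent guard before iterating. A minimal sketch of that pattern, using a plain dict to stand in for `st.session_state` (the real `web_demo.py` code may differ):

```python
# Stand-in for st.session_state to illustrate the initialize-if-absent guard
# from the Streamlit session-state docs. In a real app this would be:
#   if "messages" not in st.session_state:
#       st.session_state.messages = []
session_state = {}

def ensure_messages(state):
    """Initialize the 'messages' list if missing, then return it."""
    state.setdefault("messages", [])
    return state["messages"]

messages = ensure_messages(session_state)
messages.append({"role": "user", "content": "hello"})
# Iterating is now safe even on the first script run.
for message in session_state["messages"]:
    print(message["role"], message["content"])
```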
zhengjie.xu 2024-01-19 15:41:53 +08:00 committed by GitHub
parent 4fd9391594
commit 743e464ecc
1 changed file with 2 additions and 1 deletion


```diff
@@ -5,6 +5,7 @@ Please refer to these links below for more information:
 1. streamlit chat example: https://docs.streamlit.io/knowledge-base/tutorials/build-conversational-apps
 2. chatglm2: https://github.com/THUDM/ChatGLM2-6B
 3. transformers: https://github.com/huggingface/transformers
+Please run with the command `streamlit run path/to/web_demo.py --server.address=0.0.0.0 --server.port 7860` instead of using `python path/to/web_demo.py`.
 """
 import copy
@@ -15,7 +16,7 @@ from typing import Callable, List, Optional
 import streamlit as st
 import torch
 from torch import nn
-from transformers import AutoModelForCausalLM, AutoTokenizer
+from transformers import AutoTokenizer, AutoModelForCausalLM
 from transformers.generation.utils import LogitsProcessorList, StoppingCriteriaList
 from transformers.utils import logging
```