add cors support, update .ignore, update README

pull/236/head
Lucien 2023-03-26 00:42:28 +08:00
parent d38563a73d
commit 6f9c0aa228
4 changed files with 9 additions and 2 deletions

.gitignore vendored

@@ -1,2 +1,4 @@
 __pycache__
 .idea
+THUDM
+*.tar.gz


@@ -116,7 +116,7 @@ curl -X POST "http://127.0.0.1:8000" \
 Since the API above does not support streaming output, WebSocket support has been added on top of FastAPI.
-First install the additional dependencies `pip install 'fastapi~=0.95.0' 'websockets~=10.4'`, then run [websocket_api.py](./websocket_api.py).
+First install the additional dependencies `pip install 'fastapi~=0.95.0' 'websockets~=10.4' 'uvicorn~=0.21.1'`, then run [websocket_api.py](./websocket_api.py).
 ```shell
 python websocket_api.py


@@ -113,7 +113,7 @@ The returned value is
 This DEMO showed that you can embed ChatGLM-6B to your own website through websocket. HTML file: [websocket_demo.html](./websocket_demo.html).
-First install the additional dependency `pip install 'fastapi~=0.95.0' 'websockets~=10.4'`. Then run [websocket_api.py](./websocket_api.py) in the repo.
+First install the additional dependency `pip install 'fastapi~=0.95.0' 'websockets~=10.4' 'uvicorn~=0.21.1'`. Then run [websocket_api.py](./websocket_api.py) in the repo.
 ```shell
 python websocket_api.py

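The README hunks above install `fastapi`, `websockets`, and (with this commit) a pinned `uvicorn`, then start the server with `python websocket_api.py`. As a rough smoke test, a client can be written with the same `websockets` package. The route path (`"/"`) and the JSON payload shape below are illustrative assumptions only; the authoritative route and message format are defined in websocket_api.py and websocket_demo.html.

```python
# Hypothetical smoke test for the WebSocket server started by
# `python websocket_api.py`. The "/" route and the {"query", "history"}
# payload are guesses for illustration; check websocket_api.py and
# websocket_demo.html for the real route and message format.
import asyncio
import json

import websockets  # same package pinned above: 'websockets~=10.4'


async def main() -> None:
    # Port 8000 matches the curl example earlier in the README.
    async with websockets.connect("ws://127.0.0.1:8000/") as ws:
        await ws.send(json.dumps({"query": "Hello", "history": []}))
        # The server is expected to stream partial replies; print each frame
        # until the connection is closed.
        async for message in ws:
            print(message)


if __name__ == "__main__":
    asyncio.run(main())
```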
websocket_api.py

@@ -1,5 +1,6 @@
 from fastapi import FastAPI, WebSocket, WebSocketDisconnect
 from fastapi.responses import HTMLResponse
+from fastapi.middleware.cors import CORSMiddleware
 from transformers import AutoTokenizer, AutoModel
 import uvicorn
@@ -10,6 +11,10 @@ model = AutoModel.from_pretrained(pretrained, trust_remote_code=True).half().cud
 model = model.eval()
 app = FastAPI()
+app.add_middleware(
+    CORSMiddleware
+)
 with open('websocket_demo.html') as f:
     html = f.read()
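Note that if `CORSMiddleware` is registered with no options, Starlette's defaults apply (an empty `allow_origins` list), so cross-origin browser requests to the HTTP API are still refused. A more explicit configuration is common; the sketch below is illustrative only, and the origin, method, and header lists are assumptions rather than part of this commit.

```python
# Illustrative-only CORS setup (not part of this commit): explicitly open the
# API to browser pages served from other origins. Replace "*" with the sites
# that actually embed websocket_demo.html in production.
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],   # origins allowed to call the API; tighten in production
    allow_methods=["*"],   # allow all preflighted HTTP methods
    allow_headers=["*"],   # allow arbitrary request headers
)
```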