From 7a60161035e937db04a4521adfc8306d091b857e Mon Sep 17 00:00:00 2001
From: Tong Li
Date: Wed, 6 Nov 2024 17:24:08 +0800
Subject: [PATCH] update readme (#6116)

---
 applications/ColossalChat/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/applications/ColossalChat/README.md b/applications/ColossalChat/README.md
index ef904b864..690c39818 100755
--- a/applications/ColossalChat/README.md
+++ b/applications/ColossalChat/README.md
@@ -284,7 +284,7 @@ For more details, see [`inference/`](https://github.com/hpcaitech/ColossalAI/tre
 ## O1 Journey
 ### Inference with Self-refined MCTS
 We provide the implementation of MCT Self-Refine (MCTSr) algorithm, an innovative integration of Large Language Models with Monte Carlo Tree Search.
-To run inference with MCTS, simply use the following script.
+You can serve the model using vLLM, update the config file in `Qwen32B_prompt_CFG`, and then run the following script.
 ```python
 from coati.reasoner.guided_search.mcts import MCTS
 from coati.reasoner.guided_search.prompt_store.qwen import Qwen32B_prompt_CFG
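The hunk above ends partway through the README's example, so only the two imports are visible in the diff context. A minimal sketch of how those imports might be used follows; the `problem` string, the `max_simulations` and `cfg` keyword arguments, and the `simulate()` method are assumptions for illustration, not taken from the patch.

```python
from coati.reasoner.guided_search.mcts import MCTS
from coati.reasoner.guided_search.prompt_store.qwen import Qwen32B_prompt_CFG

# Hypothetical usage sketch: build a search tree over a single reasoning problem
# using the Qwen-32B prompt configuration served via vLLM, then run MCTS
# self-refinement. The constructor arguments and simulate() call are assumed,
# not confirmed by the hunk shown in this patch.
problem = "How many times does the letter 'r' appear in 'strawberry'?"

search_tree = MCTS(problem=problem, max_simulations=8, cfg=Qwen32B_prompt_CFG)
answer = search_tree.simulate()
print(answer)
```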