
Starting startup.py -p #3900

Closed
Ptianyu opened this issue Apr 26, 2024 · 1 comment
Labels
bug Something isn't working

Ptianyu commented Apr 26, 2024

(langchain_chat_env) root@3b31cc983ec9:/home/langchain/Langchain-Chatchat-master# python startup.py -p

==============================Langchain-Chatchat Configuration==============================
Operating system: Linux-5.19.0-42-generic-x86_64-with-glibc2.31.
Python version: 3.11.7 (main, Dec 15 2023, 18:12:31) [GCC 11.2.0]
Project version: v0.2.10
langchain version: 0.0.354. fastchat version: 0.2.35

Current text splitter: ChineseRecursiveTextSplitter
LLM models being started: ['ollama-api'] @ cuda
{'api_base_url': 'http://192.168.1.110:11434',
'api_key': 'ollama',
'device': 'cuda',
'host': '0.0.0.0',
'infer_turbo': False,
'model_name': 'llama3',
'online_api': True,
'openai_proxy': '',
'port': 11434}
Current Embeddings model: bge-large-zh @ cuda
==============================Langchain-Chatchat Configuration==============================

2024-04-26 16:14:21,549 - startup.py[line:655] - INFO: Starting services:
2024-04-26 16:14:21,549 - startup.py[line:656] - INFO: To view the llm_api logs, go to /home/langchain/Langchain-Chatchat-master/logs
usage: startup.py [-h] [-a] [--all-api] [--llm-api] [-o] [-m] [-n MODEL_NAME [MODEL_NAME ...]] [-c CONTROLLER_ADDRESS]
[--api] [-p] [-w] [-q] [-i]

options:
-h, --help show this help message and exit
-a, --all-webui run fastchat's controller/openai_api/model_worker servers, run api.py and webui.py
--all-api run fastchat's controller/openai_api/model_worker servers, run api.py
--llm-api run fastchat's controller/openai_api/model_worker servers
-o, --openai-api run fastchat's controller/openai_api servers
-m, --model-worker run fastchat's model_worker server with specified model name. specify --model-name if not
using default LLM_MODELS
-n MODEL_NAME [MODEL_NAME ...], --model-name MODEL_NAME [MODEL_NAME ...]
specify model name for model worker. add addition names with space seperated to start multiple
model workers.
-c CONTROLLER_ADDRESS, --controller CONTROLLER_ADDRESS
specify controller address the worker is registered to. default is FSCHAT_CONTROLLER
--api run api.py server
-p, --api-worker run online model api such as zhipuai
-w, --webui run webui.py server
-q, --quiet reduce fastchat service log output
-i, --lite run in Lite mode: only LLM chat and search-engine chat via online APIs are supported
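
Note: judging from the help text above, -p / --api-worker only starts the online model API worker, while the full stack (fastchat's controller/openai_api/model_worker servers plus api.py and webui.py) is what -a / --all-webui is documented to start, for example:

python startup.py -a

(The -a flag is taken directly from the usage output above; whether it resolves this particular failure is not confirmed in the thread.)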

@Ptianyu Ptianyu added the bug Something isn't working label Apr 26, 2024

dosubot bot commented Apr 26, 2024

It looks like you're trying to use the -p option with startup.py to run an online model API worker, such as zhipuai. This option is intended for that purpose, but if the service isn't starting as expected and you're only seeing the usage options, it might indicate a problem with the setup or configuration.

Ensure that your environment meets all the necessary prerequisites for the -p option to function correctly. This includes having a CUDA-compatible GPU with the appropriate NVIDIA drivers, all required Python packages installed, necessary configuration files and an initialized database in place, and the environment set up as described in the project's Dockerfile. Additionally, make sure that the necessary ports are accessible for the application's network operations.

If you're following all the setup instructions correctly and still facing issues, checking for any error messages or exceptions thrown when executing the startup.py -p command could provide more insight into the problem.
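
As a quick standalone sanity check (not part of Langchain-Chatchat itself; the endpoint URL below is copied from the api_base_url printed in the configuration block, and the requests package is assumed to be installed), a short script like this can confirm that the Ollama server is reachable from the container before the worker is started:

import requests

# Endpoint copied from the api_base_url shown in the issue's configuration output.
OLLAMA_BASE_URL = "http://192.168.1.110:11434"

try:
    # A plain GET to the root URL is enough to tell whether the port answers at all.
    resp = requests.get(OLLAMA_BASE_URL, timeout=5)
    print(f"Reachable: HTTP {resp.status_code} - {resp.text[:80]!r}")
except requests.exceptions.RequestException as exc:
    print(f"Could not reach {OLLAMA_BASE_URL}: {exc}")

If this check fails from inside the container, the problem is network or port accessibility rather than startup.py itself.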


zRzRzRzRzRzRzR self-assigned this May 1, 2024