The following error occurs after starting the app and sending a question #3911
Comments
The error `object of type 'NoneType' has no len()` means `chunk` is `None` when `len(chunk["choices"])` is called. To address this issue, ensure `chunk` is not `None` before accessing `"choices"`:

```python
if chunk is not None and "choices" in chunk and len(chunk["choices"]) == 0:
    pass  # Proceed with your logic here
```

This adjustment checks that `chunk` is a valid object before it is indexed. For enhanced error handling, consider wrapping the code that accesses `chunk` in a try/except block:

```python
import logging

try:
    if chunk is not None and "choices" in chunk and len(chunk["choices"]) == 0:
        pass  # Your logic here
except Exception as e:
    logging.exception("Error while processing chunk: %s", e)
```

This will help in identifying the cause of the error.
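A minimal, self-contained sketch of the defensive pattern suggested above (the helper name `safe_choices` and the simulated stream are hypothetical, not part of the project's code): the worker may yield a bare `None` or a chunk without a `"choices"` key, so normalize each chunk to a list before iterating.

```python
def safe_choices(chunk):
    """Return the chunk's list of choices, or [] if the chunk is None/malformed."""
    if chunk is None:  # upstream worker yielded nothing usable
        return []
    if not isinstance(chunk, dict):  # unexpected chunk type
        return []
    choices = chunk.get("choices")
    return choices if choices is not None else []

# Simulated stream: a normal chunk, a None chunk, and an error chunk
stream = [
    {"choices": [{"delta": {"content": "hi"}}]},
    None,
    {"error": "worker failed"},
]
for chunk in stream:
    for choice in safe_choices(chunk):
        print(choice["delta"]["content"])  # only the valid chunk prints
```

With this guard, the loop simply skips malformed chunks instead of raising `TypeError: object of type 'NoneType' has no len()`.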
INFO: 127.0.0.1:58524 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO: 127.0.0.1:58524 - "GET /docs HTTP/1.1" 200 OK
INFO: 127.0.0.1:58524 - "GET /openapi.json HTTP/1.1" 200 OK
2024-04-28 14:41:00,003 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:58538 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-04-28 14:41:00,008 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-04-28 14:41:00,526 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:58538 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-04-28 14:41:00,530 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:58538 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-04-28 14:41:00,562 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
2024-04-28 14:41:16,194 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:58550 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-04-28 14:41:16,199 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
2024-04-28 14:41:16,606 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:58550 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-04-28 14:41:16,611 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:58550 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-04-28 14:41:16,642 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
INFO: 127.0.0.1:58550 - "POST /chat/chat HTTP/1.1" 200 OK
2024-04-28 14:41:16,654 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/chat/chat "HTTP/1.1 200 OK"
2024-04-28 14:41:17 | INFO | stdout | INFO: 127.0.0.1:58553 - "POST /v1/chat/completions HTTP/1.1" 200 OK
2024-04-28 14:41:17,386 - _client.py[line:1758] - INFO: HTTP Request: POST http://127.0.0.1:20000/v1/chat/completions "HTTP/1.1 200 OK"
2024-04-28 14:41:17 | INFO | httpx | HTTP Request: POST http://127.0.0.1:20002/worker_generate_stream "HTTP/1.1 200 OK"
2024-04-28 14:41:17,741 - utils.py[line:38] - ERROR: object of type 'NoneType' has no len()
Traceback (most recent call last):
File "C:\Users\Administrator\Desktop\langchain-master\server\utils.py", line 36, in wrap_done
await fn
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain\chains\base.py", line 385, in acall
raise e
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain\chains\base.py", line 379, in acall
await self._acall(inputs, run_manager=run_manager)
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain\chains\llm.py", line 275, in _acall
response = await self.agenerate([inputs], run_manager=run_manager)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain\chains\llm.py", line 142, in agenerate
return await self.llm.agenerate_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 554, in agenerate_prompt
return await self.agenerate(
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 514, in agenerate
raise exceptions[0]
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 617, in _agenerate_with_cache
return await self._agenerate(
^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain_community\chat_models\openai.py", line 522, in _agenerate
return await agenerate_from_stream(stream_iter)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 87, in agenerate_from_stream
async for chunk in stream:
File "C:\Users\Administrator\Desktop\langchain-master\venv\Lib\site-packages\langchain_community\chat_models\openai.py", line 493, in _astream
if len(chunk["choices"]) == 0:
^^^^^^^^^^^^^^^^^^^^^
TypeError: object of type 'NoneType' has no len()
2024-04-28 14:41:17,744 - utils.py[line:40] - ERROR: TypeError: Caught exception: object of type 'NoneType' has no len()