
stream=true returns improper response format #41

Open
kwekewk opened this issue Jul 12, 2023 · 1 comment

Comments

@kwekewk

kwekewk commented Jul 12, 2023

When using the stream=true parameter, the structure of the response differs from the one returned with stream=false, so some apps throw errors when parsing it.

Steps to reproduce

  1. Send request with stream=false
--data-raw '{"messages":[{"role":"user","content":"sayyes"}],"model":"gpt-3.5-turbo","temperature":1,"presence_penalty":0,"top_p":1,"frequency_penalty":0,"stream":false}'    

Response:

{"id":"chatcmpl-CuQKKLmyuGrdzaiqxQefJCyJetvrV","object":"chat.completion","created":1689148811,"choices":[{"index":0,"message":{"role":"assistant","content":"Yes! How may I assist you today?","name":""},"finish_reason":"stop"}],"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}
  2. Send request with stream=true
--data-raw '{"messages":[{"role":"user","content":"sayyes"}],"model":"gpt-3.5-turbo","temperature":1,"presence_penalty":0,"top_p":1,"frequency_penalty":0,"stream":true}'

Response:

data: {"choices":[{"delta":{"role":"assistant"},"finish_reason":null,"index":0}],"created":1689149092,"id":"chatcmpl-nuNuoobJbxsRfrJgenDXhsIJJDptP","model":"gpt-3.5-turbo","object":"chat.completion.chunk"}  

data: {"choices":[{"delta":{"content":"Yes! "},"finish_reason":null,"index":0}],"created":1689149093,"id":"chatcmpl-nuNuoobJbxsRfrJgenDXhsIJJDptP","model":"gpt-3.5-turbo","object":"chat.completion.chunk"}

Expected behavior

The response format should be consistent whether stream is true or false.

@juzeon
Owner

juzeon commented Jul 13, 2023

The response format is expected to differ depending on whether the "stream" parameter is set to true or false. Please refer to the following link for more information: https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb
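
For reference, here is a minimal sketch of how a client could consume an OpenAI-style SSE stream and reassemble the delta chunks into a single message (the shape returned with stream=false). The base URL, API key, and the `requests` dependency are placeholders/assumptions for illustration, not something this project provides:

```python
import json
import requests  # assumed HTTP client; install with `pip install requests`

# Hypothetical placeholders, not from this issue/repository.
BASE_URL = "http://localhost:8080/v1/chat/completions"
API_KEY = "sk-placeholder"


def stream_chat(payload):
    """Consume an OpenAI-style SSE stream and rebuild the full assistant message text."""
    payload = dict(payload, stream=True)
    full_content = ""
    with requests.post(
        BASE_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json=payload,
        stream=True,
    ) as resp:
        for raw_line in resp.iter_lines():
            if not raw_line:
                continue  # skip blank keep-alive lines between SSE events
            line = raw_line.decode("utf-8")
            if not line.startswith("data: "):
                continue
            data = line[len("data: "):]
            if data == "[DONE]":
                break  # end-of-stream sentinel used by the OpenAI streaming API
            chunk = json.loads(data)
            delta = chunk["choices"][0].get("delta", {})
            full_content += delta.get("content", "")
    return full_content


print(stream_chat({
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "say yes"}],
}))
```

With this kind of accumulation on the client side, the streamed chunks end up carrying the same text as the single non-streaming response, even though the wire formats are intentionally different.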
