Connection closed error on Vercel #291
The duration limit for Vercel functions in the hobby plan may be the cause.
@angelhodar Did you have a response that streamed longer than 10 seconds?
I didn't measure it, but yes, more or less. The moment I changed to the edge runtime the problem was solved, and as it wasn't failing locally I suppose the function timeout was the issue, but as nobody has mentioned it in the issues I am a bit confused...
I have the same question
I also have the same question
Has anyone solved it? If you ask 5 questions quickly, the error will appear.
@AireWong Have you tried adding the 2 lines of code I added in the first message to the page?
@angelhodar I still encounter the same issue even when deploying through your repository. Ask multiple questions quickly, and a connection error will occur.
@AireWong Oh, I have tried now and it's giving the connection error again, lol. It was working perfectly a few days ago... Maybe someone from Vercel can give us some guidance?
Many users have encountered the same issue. Can any genius solve it? |
Felt like I tried everything: adding the 'edge' runtime, the maxDuration setting; nothing is working. I even upgraded to Pro for longer execution time and still can't seem to run the streaming chat for more than 15-20 seconds...
I am also currently running into the same issue 😬 for me it only happens when I set it to:
Leaving it as a serverless function it all works as expected 😬
@feliche93 this doesn't work for me.
I tried this in my case and it works; long LLM responses were limited to 15s on my Vercel Pro account. In vercel.json (root path):
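For reference, a `vercel.json` that raises the function duration limit might look like the sketch below. This is an assumption on my part: the function path glob and the 60-second value are illustrative, not taken from this thread, and the allowed maximum depends on your plan.

```json
{
  "functions": {
    "app/api/chat/route.ts": {
      "maxDuration": 60
    }
  }
}
```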
If the issue isn't intermittent for you, I recommend double-checking that you have credits on your OpenAI account, as per the comment here: #309 (comment). It took a couple of minutes before the billing settings updated once I had loaded credit there.
Finally, I have moved to Groq with the Llama3 model and it goes so fast that there is no opportunity to fail hahaha. I don't think billing was the problem.
If you do not have a Pro plan, the timeout is limited to 10s. What worked for me was setting the following exports (I bought the Pro account). You can add the exports to other routes if needed. As @angelhodar pointed out, reducing inference time is always a good idea.
The old version does not have this issue. --> https://next.oaiui.com |
Hey everyone, Vercel has increased the hobby functions limit to 60s so this issue should be solved: https://twitter.com/vercel_changes/status/1788600830649639092?t=wcRiZtxoWsuch6zD_EVMsw&s=19 Can you verify? |
Hi! I have forked the repo to test the new AI SDK 3.0, and when deployed to Vercel it was crashing, giving an "Application error" on the client side after a few seconds of streaming the chat response, while locally it was working fine. Looking at the browser console, it seems it was a "Connection closed" error from the server, so I supposed it was because of a serverless function timeout. I noticed that the runtime for chat/[id]/page.tsx was not edge, as in the previous chatbot version. I have included these 2 lines (the second one optional) in the page and now it's not giving a timeout. Any reason why it was deleted with this new version? I would like confirmation just in case I am missing something. Thanks in advance!!
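The two lines referred to throughout this thread are presumably Next.js App Router route segment config exports along these lines. This is a sketch under that assumption; the exact `maxDuration` value is a guess (plan limits differ), and only `runtime` is described in the thread as strictly required.

```typescript
// Top of app/chat/[id]/page.tsx (route segment config exports).
// Run this page on the Edge runtime instead of a serverless function,
// which is what reportedly avoided the "Connection closed" timeout.
export const runtime = 'edge'

// Optional: raise the allowed execution time in seconds.
// The value 60 is an assumption; Hobby and Pro plans cap this differently.
export const maxDuration = 60
```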