
Connection closed error on Vercel #291

Open
angelhodar opened this issue Mar 26, 2024 · 19 comments

Comments

@angelhodar

angelhodar commented Mar 26, 2024

Hi! I have forked the repo to test the new AI SDK 3.0, and when deployed to Vercel it was crashing, giving a client-side Application error after a few seconds of streaming the chat response, while locally it worked fine.

Looking at the browser console, it seems to be a Connection closed error from the server, so I assumed it was caused by a serverless function timeout. I noticed that the runtime for chat/[id]/page.tsx is no longer edge, as it was in the previous chatbot version. I added these 2 lines (the second one optional) to the page and it no longer times out:

export const runtime = 'edge'
export const preferredRegion = 'home'

Any reason why it was removed in this new version? I would like confirmation just in case I am missing something. Thanks in advance!!
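For comparison, if you would rather stay on the default Node.js serverless runtime instead of switching to edge, Next.js also lets you raise the function timeout from the route file itself via a maxDuration segment export. A sketch (the value 60 is an assumption; the actual ceiling depends on your Vercel plan):

```typescript
// app/(chat)/chat/[id]/page.tsx
// Sketch: keep the default Node.js serverless runtime but extend the timeout.
// The maximum allowed value for maxDuration depends on the Vercel plan.
export const maxDuration = 60
```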

@nikohann

The duration limit for Vercel functions on the hobby plan may be the cause.

https://vercel.com/docs/functions/limitations

@athrael-soju

@angelhodar Did you have a response that streamed longer than 10 seconds?

@angelhodar
Author

@angelhodar Did you have a response that streamed longer than 10 seconds?

I didn't measure it, but yes, more or less. The moment I switched to the edge runtime the problem was solved, and since it wasn't failing locally I assume the function timeout was the issue, but since nobody else has mentioned it in the issues I am a bit confused...

@TimchaStudio

I have the same question

@AireWong

I also have the same question

@AireWong

AireWong commented Mar 31, 2024

Has anyone solved it?
Website: https://oai-chatbot-ui.vercel.app

Ask 5 questions quickly and the error will appear.


@angelhodar
Author

@AireWong Have you tried adding the 2 lines of code I added in the first message to the chat/[id]/page.tsx route?

@AireWong

AireWong commented Apr 1, 2024

@angelhodar I still encounter the same issue even when deploying through your repository.

Ask multiple questions quickly, and a connection error will occur.
My website: https://oai-chat.vercel.app/
The KV Database Instance location is Singapore.
My Fork repository: https://github.com/AireWong/ai-chatbot

@angelhodar
Author

@AireWong Oh, I have tried it now and it's giving the connection error again lol. It was working perfectly a few days ago... Maybe someone from Vercel can give us some guidance?

@AireWong

AireWong commented Apr 2, 2024

Many users have encountered the same issue. Can any genius solve it?

@AmmarByFar

Feels like I've tried everything: adding the 'edge' runtime, the maxDuration setting. Nothing is working. I even upgraded to Pro for a longer execution time and still can't seem to run the streaming chat for more than 15-20 seconds...

@feliche93

I am also currently running into the same issue 😬 For me it only happens when I set it to:

export const runtime = 'edge'
export const preferredRegion = 'fra1'

Leaving it as a serverless function, it all works as expected 😬

@AireWong

AireWong commented Apr 7, 2024

@feliche93 this doesn't work for me.

AireWong@bcc4aa7

@olivermontes

olivermontes commented Apr 15, 2024

I tried this in my case and it works; long LLM responses were limited to 15s on a Vercel Pro account.

vercel.json (root path)

{
  "functions": {
    "app/[domain]/chat/[id]/page.tsx": {
      "maxDuration": 60
    }
  }
}
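If you are on the Next.js App Router, the same limit can usually be set from the route file itself instead of vercel.json. An equivalent sketch (same caveat: the plan's ceiling still applies):

```typescript
// app/[domain]/chat/[id]/page.tsx
// Equivalent per-route setting via the Next.js route segment export.
export const maxDuration = 60 // seconds; Vercel plan limits still apply
```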


@eb379

eb379 commented Apr 23, 2024

If the issue isn't intermittent for you, I recommend double-checking that you have credits on your OpenAI account, as per the comment here: #309 (comment)

It took a couple of minutes before the billing settings updated once I had loaded credit there.

@angelhodar
Author

If the issue isn't intermittent for you, I recommend double-checking that you have credits on your OpenAI account, as per the comment here: #309 (comment)

It took a couple of minutes before the billing settings updated once I had loaded credit there.

In the end I moved to Groq with the Llama 3 model and it is so fast that there is no opportunity to fail hahaha. I don't think billing was the problem.

@themataleao
Contributor

If you do not have a Pro plan, the timeout is limited to 10s.

What worked for me is setting the following exports (I bought the Pro account):

export const runtime = 'edge'
export const preferredRegion = 'home'
export const maxDuration = 300

in:
app/(chat)/chat/[id]/page.tsx
app/(chat)/page.tsx

You can add the exports to other routes if needed.

As @angelhodar pointed out, reducing inference time is always a good idea.

@TimchaStudio

The old version does not have this issue. --> https://next.oaiui.com

@angelhodar
Author

Hey everyone, Vercel has increased the Hobby plan function limit to 60s, so this issue should be solved: https://twitter.com/vercel_changes/status/1788600830649639092?t=wcRiZtxoWsuch6zD_EVMsw&s=19

Can you verify?
