
Add environment variable for openai timeout, backoff and max retries #7610

Closed

nickprock opened this issue Apr 29, 2024 · 7 comments · Fixed by #7653
Comments

@nickprock
Contributor

Is your feature request related to a problem? Please describe.
In version 1, Haystack allows setting three environment variables to manage:

  • OPENAI_TIMEOUT
  • OPENAI_MAX_RETRIES
  • OPENAI_BACKOFF

It would also be useful for version 2.0.
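A minimal sketch of what honoring these variables in 2.0 could look like. The variable names `OPENAI_TIMEOUT` and `OPENAI_MAX_RETRIES` are the ones requested here; the helper name and the default values are placeholders for illustration, not Haystack's actual implementation:

```python
import os


def openai_client_kwargs() -> dict:
    """Build kwargs for the official `openai.OpenAI(...)` constructor,
    which accepts `timeout` and `max_retries` parameters.

    The fallback defaults (30 s timeout, 5 retries) are illustrative only.
    """
    return {
        "timeout": float(os.environ.get("OPENAI_TIMEOUT", "30")),
        "max_retries": int(os.environ.get("OPENAI_MAX_RETRIES", "5")),
    }
```

Note that `OPENAI_BACKOFF` may not map one-to-one onto a client kwarg: the current `openai` Python client applies exponential backoff between retries internally rather than exposing a fixed backoff setting.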

@CarlosFerLo
Contributor

Working on it.

@CarlosFerLo
Contributor

@nickprock, I've been looking for these environment variables on the v1.x and v1.25.x branches and cannot find any reference to them. Could you give me more info, please?

@nickprock
Contributor Author

nickprock commented Apr 30, 2024

In 1.25 there are these environment variables:

# Any remote API (OpenAI, Cohere etc.)
HAYSTACK_REMOTE_API_BACKOFF_SEC = "HAYSTACK_REMOTE_API_BACKOFF_SEC"
HAYSTACK_REMOTE_API_MAX_RETRIES = "HAYSTACK_REMOTE_API_MAX_RETRIES"
HAYSTACK_REMOTE_API_TIMEOUT_SEC = "HAYSTACK_REMOTE_API_TIMEOUT_SEC"
HAYSTACK_PROMPT_TEMPLATE_ALLOWED_FUNCTIONS = "HAYSTACK_PROMPT_TEMPLATE_ALLOWED_FUNCTIONS"

https://github.com/deepset-ai/haystack/blob/v1.25.5/haystack/environment.py
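For reference, a hedged sketch of how such name constants are typically consumed: look the variable up in the environment and fall back to a default. The default values below are placeholders for illustration, not necessarily the ones Haystack v1 uses:

```python
import os

# Constant names copied from haystack/environment.py in v1.25.
HAYSTACK_REMOTE_API_BACKOFF_SEC = "HAYSTACK_REMOTE_API_BACKOFF_SEC"
HAYSTACK_REMOTE_API_MAX_RETRIES = "HAYSTACK_REMOTE_API_MAX_RETRIES"
HAYSTACK_REMOTE_API_TIMEOUT_SEC = "HAYSTACK_REMOTE_API_TIMEOUT_SEC"


def remote_api_settings() -> tuple[float, int, float]:
    """Read timeout, max retries, and backoff for remote APIs.

    Fallback defaults (30 s, 5 retries, 10 s backoff) are illustrative.
    """
    timeout = float(os.environ.get(HAYSTACK_REMOTE_API_TIMEOUT_SEC, "30"))
    retries = int(os.environ.get(HAYSTACK_REMOTE_API_MAX_RETRIES, "5"))
    backoff = float(os.environ.get(HAYSTACK_REMOTE_API_BACKOFF_SEC, "10"))
    return timeout, retries, backoff
```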

I talked with @vblagoje on Discord here: https://discord.com/channels/993534733298450452/1232706764294717511

@CarlosFerLo
Contributor

@nickprock the Discord invitation does not work. Could you send a new one, please?

@nickprock
Contributor Author

@CarlosFerLo
Contributor

Not really getting anywhere, and I will be out for the next few days, so feel free to proceed with this implementation. :)

@sjrl
Contributor

sjrl commented May 13, 2024

Hey @masci, just wanted to give you a heads up that deepset Cloud + Sol will want this, since we used this extensively for client pipelines that timed out when using bigger models like GPT-4.
