download_cached_file should retry if there is an HTTP error code #3804
Note: unlike #3803, this is not a regression, as the pre-requests code didn't do this either.
Note that I've also looked for a sane requests wrapper that provides simplified retries and, oddly, not found one.
Note: if there is a retry, then surely there needs to be a new status code check somewhere, but this doesn't appear to be documented anywhere in requests; see #3805 (comment).
It looks like requests now supports
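One way to get retries through requests is to mount an `HTTPAdapter` configured with a `urllib3` `Retry` object onto a `Session`. The sketch below is an assumption about how this could be wired up for this issue, not the project's actual implementation; the function name `make_retrying_session` and the parameter values are hypothetical.

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry


def make_retrying_session(max_retries=3, backoff_factor=0.3):
    """Build a requests Session that retries transient server errors.

    max_retries and backoff_factor values here are illustrative, not
    the values the project settled on.
    """
    retry = Retry(
        total=max_retries,
        backoff_factor=backoff_factor,
        # Retry only on transient 5xx responses, not on 4xx client errors.
        status_forcelist=(500, 502, 503, 504),
    )
    adapter = HTTPAdapter(max_retries=retry)
    session = requests.Session()
    session.mount('http://', adapter)
    session.mount('https://', adapter)
    return session
```

With this, `session.get(url)` transparently retries qualifying failures, sleeping roughly `backoff_factor * (2 ** (attempt - 1))` seconds between attempts, before urllib3 gives up and raises.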
Optional: if a transient error occurred, such as 500+, the bear should be disabled with a warning. However, this complexity can be deferred and added later in #3332.
I would like to be assigned this issue.
What value of max_retries is expected? Also, what would be an optimal backoff_factor value for the retries?
If max_retries is hit and the error code persists, a RetryError is raised, which causes the pytest run to fail. How should that test failure be handled?
How about handling the RetryError by attempting the download once more (in the except clause), so that an HTTPError can be raised with the latest status code?
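The suggestion above can be sketched as follows. This is only one possible shape for it, assuming a retrying session like the one discussed earlier; the function name `download_with_final_attempt` is hypothetical.

```python
import requests
from requests.exceptions import RetryError


def download_with_final_attempt(session, url, **kwargs):
    """Download via a retrying session; if it gives up with RetryError,
    make one final plain request so raise_for_status() can surface the
    latest concrete status code as an HTTPError.
    """
    try:
        response = session.get(url, **kwargs)
    except RetryError:
        # Hypothetical fallback: a single un-retried request, purely to
        # obtain a real response object whose status code can be raised.
        response = requests.get(url)
    response.raise_for_status()
    return response
```

The trade-off is one extra request after retries are exhausted, in exchange for an HTTPError that carries an actual status code rather than an opaque RetryError.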
Please review the PR and suggest changes.
Follows on from #3803
The interface of Bear.download_cached_file isn't suited to failure. It needs to retry if possible.
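As a rough illustration of the behaviour the issue asks for, here is a minimal manual retry loop: re-attempt on transient 5xx responses with exponential backoff, raise immediately for 4xx, and raise an HTTPError with the last status code once retries are exhausted. This is a sketch, not the actual `Bear.download_cached_file` implementation; the function name `download_file` and the default values are assumptions.

```python
import time

import requests


def download_file(url, max_retries=3, backoff_factor=0.3):
    """Fetch url, retrying transient 5xx responses before giving up.

    Raises requests.HTTPError for 4xx immediately, and for 5xx once
    max_retries attempts have been exhausted.
    """
    for attempt in range(max_retries + 1):
        response = requests.get(url, timeout=10)
        if response.status_code < 500:
            response.raise_for_status()  # still raise for 4xx client errors
            return response
        if attempt < max_retries:
            # Exponential backoff between retries (illustrative formula).
            time.sleep(backoff_factor * (2 ** attempt))
    # All attempts returned 5xx: surface the last status code.
    response.raise_for_status()
```

In practice, mounting a retry-configured adapter on a Session achieves the same thing with less hand-rolled code, but the explicit loop makes the status-code check visible.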