Streaming requests read full response contents before initial cache write #878
Comments
You're correct, the current level of support for streaming requests is making sure the stream can be played back correctly when returned from the cache; in other words, chunking behavior is preserved in the underlying file-like object.

I definitely agree that it would be an improvement for large requests like this if we could cache a streaming response only after it reaches the end of the stream. In general, this library isn't optimized for file downloads and other large requests, but it is something on my radar (#407). There would be a few different ways to approach this, but I can't think of any particularly clean solutions right now; I'll need to give it some more thought. Meanwhile, I'll try to at least come up with a workaround you can use.
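One way the "cache only at end-of-stream" idea could be approached is a tee-style wrapper around the chunk iterator: chunks are released to the caller immediately, and the accumulated body is handed to a cache-write callback only once the stream is exhausted. This is a minimal sketch, not part of requests-cache; `DeferredCacheReader` and `on_complete` are hypothetical names.

```python
import io


class DeferredCacheReader:
    """Wrap a chunk iterator; yield chunks to the caller immediately,
    and pass the complete body to a callback only once the stream ends."""

    def __init__(self, chunks, on_complete):
        self._chunks = chunks
        self._on_complete = on_complete
        self._buffer = io.BytesIO()

    def __iter__(self):
        for chunk in self._chunks:
            self._buffer.write(chunk)  # accumulate a copy for the cache
            yield chunk                # but release to the caller right away
        # Only reached after the last chunk: safe point to write the cache entry.
        self._on_complete(self._buffer.getvalue())


# Demo: the callback fires only after the final chunk has been consumed.
cached = {}
reader = DeferredCacheReader(iter([b"ab", b"cd"]), lambda body: cached.update(body=body))
chunks = []
for c in reader:
    chunks.append(c)
    assert "body" not in cached  # nothing is cached while streaming
assert cached["body"] == b"abcd"
```

In a real integration the callback would build and store the cached response; the caller's chunking behavior is unchanged because each chunk is yielded before it is buffered for the cache.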
Thanks for the example. I'm guessing …
I don't think that's related, though. Could you create a separate issue for that, please?
Problem with streaming queries
Queries do not seem to stream appropriately: the whole response appears to be read by CachedSession before the handle is released (the example in the docs is not a good test case, as it completes too quickly to notice).
Expected behavior
Chunks should be available for processing as soon as they are received.
Steps to reproduce the behavior
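The original reproduction snippet did not survive on this page. A minimal sketch of the kind of measurement that exposes the issue is timing the arrival of the first chunk; `first_chunk_latency` and `FakeResponse` are hypothetical names, with the fake standing in for a slow server so the sketch is self-contained.

```python
import time


def first_chunk_latency(response, chunk_size=8192):
    """Return (first_chunk, seconds elapsed until it arrived)
    for a streaming response."""
    start = time.monotonic()
    chunk = next(response.iter_content(chunk_size=chunk_size))
    return chunk, time.monotonic() - start


class FakeResponse:
    """Stand-in for a streaming HTTP response: two chunks, 0.2 s apart."""

    def iter_content(self, chunk_size):
        for chunk in (b"x" * chunk_size, b"y" * chunk_size):
            time.sleep(0.2)
            yield chunk


chunk, latency = first_chunk_latency(FakeResponse())
# latency is roughly one chunk delay (~0.2 s); if the whole body were
# read before the first chunk was released, it would be ~0.4 s
```

With a real session the same helper would be called as `first_chunk_latency(session.get(url, stream=True))`; the reported behavior is that under CachedSession the latency approaches the full download time rather than the time to the first chunk.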
Workarounds
It seems that `install_cache` is enough to circumvent this behaviour. I checked the behaviour of requests_cache and can confirm that the `stream` argument is correctly passed to requests.
Environment
requests-cache version: 1.1.0
Python version: 3.9