[BUG] [AiServices] OllamaStreamingChatModel partial response deserialization throws MalformedJsonException #1013
Comments
Hi, it should already be fixed by this:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-ollama</artifactId>
    <version>0.31.0-SNAPSHOT</version>
</dependency>
Hi @langchain4j, I've checked the snapshot version but it still does not work.
Hi @kchaber, thanks for the heads up!
@kchaber please close this issue if it worked for you 🙏
@langchain4j yes, that works correctly now |
Describe the bug
A com.google.gson.stream.MalformedJsonException is thrown during TokenStream consumption when using the OllamaStreamingChatModel through AiServices.
Log and Stack trace
To Reproduce
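A minimal setup along the following lines exercises the failing path. The model name, base URL, and the Assistant interface are illustrative assumptions, not taken from the original report:

```java
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.TokenStream;

public class OllamaStreamingRepro {

    // Illustrative AI service; the original report does not show its interface.
    interface Assistant {
        TokenStream chat(String message);
    }

    public static void main(String[] args) {
        // Assumed local Ollama endpoint and model name.
        StreamingChatLanguageModel model = OllamaStreamingChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3")
                .build();

        Assistant assistant = AiServices.builder(Assistant.class)
                .streamingChatLanguageModel(model)
                .build();

        // Consuming the TokenStream is where the MalformedJsonException surfaces.
        assistant.chat("Tell me a long story")
                .onNext(System.out::print)
                .onComplete(response -> System.out.println("\n[done]"))
                .onError(Throwable::printStackTrace)
                .start();
    }
}
```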
Expected behavior
No exception is thrown when processing the streaming response.
Please complete the following information:
Additional context
The exception is thrown in the OllamaClient when the partial response does not contain valid JSON, as presented in the debug output below:
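For illustration only, the sketch below shows how parsing a streamed chunk that ends mid-object fails in Gson. The fragment is made up, and the exact cause wrapped by Gson (MalformedJsonException vs. EOFException) depends on where the stream happens to be cut; neither is taken from the original debug output:

```java
import com.google.gson.Gson;
import com.google.gson.JsonSyntaxException;

public class PartialChunkParseDemo {

    public static void main(String[] args) {
        // Ollama streams newline-delimited JSON objects; a network chunk can end
        // in the middle of an object. This fragment is invented for illustration.
        String partialChunk = "{\"model\":\"llama3\",\"response\":\"Hel";

        try {
            new Gson().fromJson(partialChunk, Object.class);
        } catch (JsonSyntaxException e) {
            // Gson reports the truncated input as a parse failure, which is the
            // kind of error described in this issue.
            System.out.println("Failed to parse partial chunk: " + e);
        }
    }
}
```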