[Bug]: Langfuse integration not supporting Ollama Chat API #3544
Labels: bug (Something isn't working)

Comments
Viktor2k changed the title from "[Bug]:" to "[Bug]: Langfuse integration not supporting Ollama Chat API" on May 9, 2024.
I was able to replicate it. In the chat case, the message is a dict rather than a Message object. Looking into it.
krrishdholakia added a commit that referenced this issue on May 13, 2024: "Fixes #3544 based on the data-type of message"
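The commit message suggests the fix dispatches on the message's data type, since the `ollama_chat` path yields a plain dict while the `ollama` path yields a `Message`-like object. A minimal sketch of that idea (the `Message` dataclass below is a stand-in for illustration, not litellm's actual class):

```python
# Sketch of type-dispatching content extraction: handle both a dict-shaped
# message (as produced on the ollama_chat path) and an object exposing a
# .content attribute (Message-like, as on the ollama path).
from dataclasses import dataclass


@dataclass
class Message:  # stand-in for litellm's Message object, for illustration only
    role: str
    content: str


def message_content(message) -> str:
    """Return the text content whether `message` is a dict or a Message-like object."""
    if isinstance(message, dict):
        return message.get("content", "")
    return getattr(message, "content", "")


# Both shapes now yield the same content for logging:
print(message_content({"role": "assistant", "content": "hi"}))   # hi
print(message_content(Message(role="assistant", content="hi")))  # hi
```

A logger that extracts content this way no longer crashes when the chat API hands it a dict instead of an object.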
What happened?
I'm using Ollama to run my models and am currently trying to integrate it with Langfuse, but I run into an issue when using the recommended ollama_chat API. Here's a minimal example with the error, where the only thing I change between the two completion requests is the provider prefix, from ollama_chat to ollama.

Relevant log output
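The reporter's original snippet and log output were not captured on this page, but the setup described above can be sketched as follows. This is a hedged reconstruction, not the author's exact script: it assumes `pip install litellm langfuse`, a local Ollama server, Langfuse credentials in the usual `LANGFUSE_*` environment variables, and `llama2` as a placeholder model name.

```python
# Hypothetical minimal reproduction of the issue: the only difference between
# the working and failing completion calls is the provider prefix.
MESSAGES = [{"role": "user", "content": "Hello!"}]


def model_name(provider_prefix: str, base_model: str = "llama2") -> str:
    # litellm routes by provider prefix, e.g. "ollama/llama2" vs "ollama_chat/llama2".
    return f"{provider_prefix}/{base_model}"


def main() -> None:
    import litellm  # imported lazily so the sketch can be read without litellm installed

    litellm.success_callback = ["langfuse"]  # route completion logs to Langfuse
    litellm.completion(model=model_name("ollama"), messages=MESSAGES)       # logs fine
    litellm.completion(model=model_name("ollama_chat"), messages=MESSAGES)  # errors in the Langfuse logger


# Call main() with an Ollama server running to reproduce the error.
```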
Twitter / LinkedIn details
No response