[QUESTION] Tool Agent support for chatollama #2383
Comments
For Ollama, we need to add a dedicated integration for function calling: https://js.langchain.com/docs/integrations/chat/ollama_functions
Ah great, good to know I'm not the only one. Is there an ETA on this? Happy to donate $50. Only for learning/testing at the moment.
Workaround: use the ChatLocalAI block for Ollama with base URL http://host.docker.internal:11434/v1, but this is without streaming (or is streaming not supported for tools with any provider?)
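The workaround above relies on Ollama exposing an OpenAI-compatible endpoint under `/v1`. As a minimal sketch of the kind of request such a block would send (the model name `mistral` and the message are placeholders, not values from this thread):

```javascript
// Build an OpenAI-compatible chat completion request for Ollama's /v1 endpoint.
// Base URL and model name are placeholders; adjust for your setup.
function buildOllamaRequest(baseUrl, model, userMessage) {
  return {
    url: `${baseUrl}/chat/completions`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userMessage }],
        stream: false, // per the comment above, streaming does not work with tools here
      }),
    },
  };
}

const req = buildOllamaRequest(
  "http://host.docker.internal:11434/v1",
  "mistral",
  "What is 8 times 9?"
);
// To actually send it:
// fetch(req.url, req.options).then(r => r.json()).then(console.log);
```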
This doesn't seem to use the calculator tool, for example. Ollama itself works; it's the agent tools that are not working.
Yes, I have tried too. It does not look like it used the "calculator" tool.
If I understand correctly, this will be a new feature? When can we expect to see this in Flowise? ETA, please? Warm Regards,
I have sponsored $50 if that helps. This will allow a lot of people to run the Tool Agent locally.
Here: #2403
Hi @HenryHengZJ, thank you very much for the enhancement/feature. Do I just have to upgrade Flowise to get this feature to work? Warm Regards,
Hi @HenryHengZJ, please let me know how to use the Tool System Prompt. You can see the tool system prompt in #2403 for reference. Warm Regards,
Because every LLM works differently, the idea is for you to modify the prompt so the LLM picks and uses the tools appropriately. The PR will need to be merged before you can test it on Flowise.
Hi @HenryHengZJ, in the meantime, I tried cloning the branch you have been working on (feature/ChatOllama), built it, and ran Flowise. This is what I gave in the Tool System Prompt. This is the output I got: (Failed to parse a function call from mistral output: {"num1": 8, "num2": 9}). I will wait for the merged code. Warm Regards,
@KarthickMani87 it's generally not recommended to modify the tool prompt; I've added more description to it: Under the hood, Ollama's JSON mode is used to constrain output to JSON. The output JSON will contain two keys:
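The comment above is truncated before the two keys are listed, so the key names `tool` and `tool_input` in this sketch are assumptions for illustration only, not confirmed from the PR. Still, a parser along these lines would explain the error reported earlier: when the model emits only the raw arguments (as in `{"num1": 8, "num2": 9}`) instead of the full two-key object, parsing fails.

```javascript
// Hypothetical parser for JSON-mode function-call output.
// The expected keys ("tool", "tool_input") are illustrative assumptions.
function parseFunctionCall(raw) {
  let parsed;
  try {
    parsed = JSON.parse(raw);
  } catch {
    throw new Error(`Failed to parse a function call from output: ${raw}`);
  }
  // The error in the thread occurs when the required keys are missing:
  if (typeof parsed.tool !== "string" || parsed.tool_input === undefined) {
    throw new Error(`Failed to parse a function call from output: ${raw}`);
  }
  return { tool: parsed.tool, toolInput: parsed.tool_input };
}
```

Under these assumptions, `parseFunctionCall('{"num1": 8, "num2": 9}')` throws (the arguments arrived without a tool name, matching the error above), while `parseFunctionCall('{"tool": "calculator", "tool_input": {"num1": 8, "num2": 9}}')` succeeds.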
Hi @HenryHengZJ , Thanks again for completing this feature. |
Hi @HenryHengZJ, I have a question. In this flow, the webhook was triggered only once, when a name and phone number were received from the user. But in the case of ollamaFunction, the webhook is triggered every time. How do I make it call the tool only under a specific condition in the chat? For example, I want to post the URL only for a name and phone number; the rest of the time it should just answer my questions.
Use the Conversational Agent instead of the Tool Agent.
There is a use case where I want to post to the URL API as soon as the user enters their name and phone number. This works fine when I use ChatOpenAI. I am attaching a screenshot for reference. This is the working use case; you can see the webhook POST has been sent successfully.
I am trying the same flow, except ChatOpenAI has been replaced with Ollama running locally. But I am getting the following error: "Error: This agent requires that the "bindTools()" method be implemented on the input model."
I am aware the Tool Agent supports models that support function calling (such as ChatOpenAI and ChatMistral). Any suggestions would be helpful.
Is there any other way the same use case can be accomplished with Ollama running locally?
Warm Regards,
Karthick