
[QUESTION] Tool Agent support for chatollama #2383

Closed
KarthickMani87 opened this issue May 10, 2024 · 16 comments · Fixed by #2403
Labels
enhancement New feature or request in-work Issue in work

Comments

@KarthickMani87

There is a use case where I want to post URL API as soon as the user enters his name and phone number. This is working fine when I use chatOpenAI. I am attaching the screenshot for reference. This is the working use case. You can see the webhook post has been sent successfully.

[screenshot]

I am trying the same flow, except ChatOpenAI has been replaced with Ollama running locally. But I am getting the following error: "Error: This agent requires that the "bindTools()" method be implemented on the input model."

[screenshot]

I am aware the Tool Agent supports models that support function calling (such as ChatOpenAI and ChatMistral). Any suggestions would be helpful.
Is there any other way the same use case can be accomplished with Ollama running locally?

Warm Regards,
Karthick

@KarthickMani87 KarthickMani87 changed the title [FEATURE] Tool Agent support for chatollama May 10, 2024
@KarthickMani87 KarthickMani87 changed the title Tool Agent support for chatollama [QUESTION] Tool Agent support for chatollama May 10, 2024
@HenryHengZJ
Contributor

For Ollama, we need to add the dedicated integration for function calling: https://js.langchain.com/docs/integrations/chat/ollama_functions

@HenryHengZJ HenryHengZJ added the enhancement New feature or request label May 10, 2024
@snailbrainx
Sponsor

Ah great, good to know I'm not the only one.

Is there an ETA on this? Happy to donate $50. I'm only using it for learning/testing at the moment.

@napa3um

napa3um commented May 11, 2024

Workaround: use the ChatLocalAI node for Ollama with base URL http://host.docker.internal:11434/v1, but this is without streaming (or is streaming not supported for tools with any provider?).
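To sanity-check this workaround, you can hit Ollama's OpenAI-compatible chat endpoint directly before wiring it into ChatLocalAI. This is a sketch: it assumes Ollama is running on the Docker host and that the `mistral` model has been pulled (the model name is just an example).

```shell
# Call Ollama's OpenAI-compatible chat completions endpoint from inside
# a Docker container; host.docker.internal resolves to the host machine.
curl http://host.docker.internal:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "mistral",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```

If this returns a JSON chat completion, the base URL is correct; if it hangs or refuses the connection, the problem is reachability rather than Flowise.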

@snailbrainx
Sponsor

http://host.docker.internal:11434/v1

This doesn't seem to use the calculator tool, for example:

[screenshots]

Ollama itself works; it's the Agent Tools that aren't.

@KarthickMani87
Author

@snailbrainx ,

Yes, I have tried that too. It does not look like it used the "calculator" tool.

@KarthickMani87
Author

@HenryHengZJ ,

If I understand correctly, this will be a new feature? When can we expect to see it in Flowise? An ETA, please?

Warm Regards,
Karthick

@snailbrainx
Sponsor

I have sponsored $50 if that helps.

This would allow a lot of people to run the Tool Agent locally.

@HenryHengZJ
Contributor

here - #2403

@HenryHengZJ HenryHengZJ added the in-work Issue in work label May 13, 2024
@HenryHengZJ HenryHengZJ linked a pull request May 13, 2024 that will close this issue
@KarthickMani87
Author

Hi @HenryHengZJ ,

Thank you very much for the enhancement. Do I just have to upgrade Flowise to get this feature working?

Warm Regards,
Karthick

@KarthickMani87
Author

here - #2403

Hi @HenryHengZJ ,

Please let me know how to use the Tool System Prompt. You could give an example tool system prompt in #2403 for reference.

Warm Regards,
Karthick

@HenryHengZJ
Contributor

here - #2403

Hi @HenryHengZJ ,

Please let me know how to use Tool System Prompt. You can give the tool system prompt #2403 for reference.

Warm Regards, Karthick


Because every LLM works differently, the idea is for you to modify the prompt so the LLM picks and uses the tools appropriately. The PR will need to be merged before you can test it on Flowise.

@KarthickMani87
Author

Hi @HenryHengZJ ,

In the meantime, I tried cloning the branch you have been working on (feature/ChatOllama), built it, and ran Flowise.

This is what I entered in the Tool System Prompt:

[screenshot]

This is the output I got: Failed to parse a function call from mistral output: {"num1": 8, "num2": 9}

[screenshot]

I will wait for the merged code.

Warm Regards,
Karthick

@HenryHengZJ
Contributor

@KarthickMani87 It's generally not recommended to modify the tool prompt; I've added more description to it:

[screenshot]

Under the hood, Ollama's JSON mode is used to constrain the output to JSON. The output JSON contains two keys, tool and tool_input, which we then parse to execute the tool. Because different models have different strengths, it may be helpful to pass in your own system prompt.
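The parse-then-dispatch step described above can be sketched as follows. This is an illustrative sketch, not Flowise's actual implementation; the calculator tool and its addition behavior are assumptions made for the example. It also shows why the "Failed to parse a function call" error above occurs: the model emitted valid JSON, but without the expected tool key.

```typescript
// Shape the model is constrained to emit under JSON mode.
type ToolCall = { tool: string; tool_input: Record<string, unknown> };

// Hypothetical tool registry for illustration only.
const tools: Record<string, (input: any) => unknown> = {
  calculator: ({ num1, num2 }: { num1: number; num2: number }) => num1 + num2,
};

function parseToolCall(raw: string): ToolCall {
  const parsed = JSON.parse(raw);
  if (typeof parsed.tool !== "string" || parsed.tool_input === undefined) {
    // Mirrors the error seen in the thread: JSON came back,
    // but the "tool" / "tool_input" keys are missing.
    throw new Error(`Failed to parse a function call from output: ${raw}`);
  }
  return parsed as ToolCall;
}

function runToolCall(raw: string): unknown {
  const { tool, tool_input } = parseToolCall(raw);
  const fn = tools[tool];
  if (!fn) throw new Error(`Unknown tool: ${tool}`);
  return fn(tool_input);
}

// A well-formed call dispatches to the tool:
console.log(
  runToolCall('{"tool": "calculator", "tool_input": {"num1": 8, "num2": 9}}'),
); // 17

// The failing Mistral output from the comment above lacks the "tool" key:
// runToolCall('{"num1": 8, "num2": 9}') // throws
```

A custom system prompt helps here precisely because it nudges weaker models to always wrap their arguments in the tool / tool_input envelope rather than emitting the arguments bare.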

@KarthickMani87
Author

Hi @HenryHengZJ , Thanks again for completing this feature.

@KarthickMani87
Author

Hi @HenryHengZJ ,

I have a question. In this flow, the webhook was triggered only once, when a name and phone number were received from the user.

[screenshot]

But in the case of OllamaFunctions, the webhook is triggered every time. How can I make it call the tool only under a specific condition in the chat? For example, I want to post to the URL only when given a name and phone number; the rest of the time it should just answer my questions.

@napa3um

napa3um commented May 18, 2024

How to make it call the tool only for a specific condition in the chat?

Use the Conversational Agent instead of the Tool Agent.
