
0.1.x (Use tools instead of functions) #31

Open · wants to merge 3 commits into main
Conversation

@jekalmin (Owner) commented Jan 1, 2024

@rkistner
Maybe I will add a "Use Tools" option in the next release (probably 1.0.0), so that I don't have to maintain multiple versions of the code.
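
For context, the underlying API change here is OpenAI's move from the deprecated `functions`/`function_call` request parameters to `tools`/`tool_choice`. A minimal sketch of the difference (illustrative only, not the integration's actual code; the `execute_services` spec below is abbreviated):

```python
from openai import OpenAI

client = OpenAI()

# An abbreviated function spec in the shape both API styles accept.
spec = {
    "name": "execute_services",
    "description": "Call a Home Assistant service",
    "parameters": {
        "type": "object",
        "properties": {
            "domain": {"type": "string"},
            "service": {"type": "string"},
        },
        "required": ["domain", "service"],
    },
}

messages = [{"role": "user", "content": "Turn on the kitchen light"}]

# Legacy style (deprecated by OpenAI):
legacy = client.chat.completions.create(
    model="gpt-3.5-turbo", messages=messages, functions=[spec]
)

# Tools style: the same spec, wrapped in a {"type": "function", ...} envelope.
modern = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=messages,
    tools=[{"type": "function", "function": spec}],
)
```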

@Anto79-ops commented Jan 19, 2024

Hi! If using tools instead of functions, does the prompt need to be changed, or do the functions need to be removed? Thanks.

I'm currently running the beta and would like to test this. Thanks.

@jekalmin (Owner) commented:

Thanks for your interest!

I released this in 1.0.2-beta1.
You don't have to change the prompt or functions.

Please try it and give feedback.
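
A plausible reason no prompt or function changes are needed: each legacy function spec can be wrapped into the tools envelope at request time. A hypothetical sketch (`function_specs` and `use_tools` are illustrative names, not the integration's actual identifiers):

```python
def build_llm_params(function_specs: list[dict], use_tools: bool) -> dict:
    """Return the function-calling kwargs for the chat completion request."""
    if use_tools:
        # Wrap each legacy spec in the tools envelope; the specs themselves
        # (and therefore the user's configuration) are unchanged.
        return {"tools": [{"type": "function", "function": s} for s in function_specs]}
    return {"functions": function_specs}
```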

@Anto79-ops commented Jan 20, 2024

Thank you!

I wanted to share that I've been experimenting with Beta1 using LocalAI.

With LocalAI, I have the Mixtral 8x7b (v2.7) Q6 GGUF model set up, which is supposedly one of the best models out right now.

I pointed your integration to this model and then toggled the "use tools" button and pressed submit.

I was pleasantly surprised that, by doing the above, it was able to read sensor information from my Home Assistant quite well, with no errors.

[screenshots: sensor readings returned by the model]

However, when I asked it to turn on a light, it seemed to go through the actions and acknowledged that the light was on, but in fact it didn't actually turn on.

[screenshot: the model acknowledging the light is on]

So I think using tools is moving in the right direction, but there might be some tweaks required for it to actually make service calls. Do you know if that's something that can be done in the prompt?

Thank you!
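
For reference, a tool call only results in a real state change if the model emits structured arguments for the integration to execute. Roughly the shape involved for the light example (illustrative; the exact schema is defined by the integration's default execute_services spec, and `light.living_room` is a made-up entity):

```python
# What a usable tool call looks like: structured arguments, not prose.
tool_call_arguments = {
    "list": [
        {
            "domain": "light",
            "service": "turn_on",
            "service_data": {"entity_id": "light.living_room"},
        }
    ]
}
# If the model instead answers in plain text ("The light is now on") without
# emitting a tool call, there is nothing for the integration to execute,
# which matches the symptom described above.
```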

@Anto79-ops commented:

I also used Assist to help troubleshoot. It seems to get the correct entity and the correct service, so I'm not sure why it's not working, since it's hitting the correct service and entity:

[screenshots: Home Assistant Assist showing the correct entity and service being called]

@jekalmin (Owner) commented:

As always, thanks @Anto79-ops for your cooperation.
Is there a log showing how the LLM called the function?

@Anto79-ops commented:

I can check if there's a way to look at the LocalAI logs while I send the command to turn off the light. Is that something that would be useful, or do you need the Home Assistant integration logs?

@jekalmin (Owner) commented:

I just wanted to know whether the function call appears in the message history log.
Let me try this soon.
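
One way to check this outside Home Assistant is to hit the LocalAI endpoint directly and inspect whether the response message contains any tool calls. A sketch (the base URL, model name, and abbreviated tool spec are assumptions to adapt to your setup):

```python
from openai import OpenAI

# Point the client at a local LocalAI server instead of api.openai.com.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="sk-local")

resp = client.chat.completions.create(
    model="dolphin-2.7-mixtral-8x7b",  # whatever name your LocalAI config uses
    messages=[{"role": "user", "content": "Turn on the kitchen light"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "execute_services",
            "description": "Call a Home Assistant service",
            "parameters": {"type": "object", "properties": {}},
        },
    }],
)

msg = resp.choices[0].message
print("content:", msg.content)        # the plain-text answer, if any
print("tool_calls:", msg.tool_calls)  # None here means no function was called
```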

@Anto79-ops commented:

Fantastic. You'll be very surprised how well it works. Here is the model I'm using (Q6_K version):

https://huggingface.co/TheBloke/dolphin-2.7-mixtral-8x7b-GGUF

It's not a small model, so it may take 20 to 40 seconds to reply if you don't have a decent CPU/GPU.

Let me know how it goes!

@ex10ded commented Feb 4, 2024

LocalAI does not support function calling right now; you need to instruct your model to generate function calls and parse the output.

This integration relies on the response from the OpenAI API having the is_function_call value set, and LocalAI models are not trained to produce this. I am investigating integration with https://github.com/MeetKai/functionary, which, combined with their special vLLM server, seems promising in its responses - but it's weak at general responses, so you really need multiple models. Tricky, tricky.
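
What "instruct your model to generate function calls and parse the output" can look like in practice, as a minimal sketch (the prompt wording and parsing strategy are illustrative, not a specific library's approach):

```python
import json
import re

SYSTEM_PROMPT = (
    "When you need to perform an action, reply ONLY with a JSON object like: "
    '{"function": "execute_services", "arguments": {...}}'
)

def extract_function_call(raw_text: str) -> dict | None:
    """Pull the first JSON object out of the model's free-text reply, if any."""
    match = re.search(r"\{.*\}", raw_text, re.DOTALL)
    if match is None:
        return None
    try:
        return json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
```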

@Anto79-ops commented:

@ex10ded Thanks for your comments. Have you been able to get the functionary v2 GGUF model to work with LocalAI? It seems to require a special chat template if you use GGUF (not vLLM):

mudler/LocalAI#1641

@ex10ded commented Feb 5, 2024


No, I hit the same issue as you - the template does not seem to work with anything other than their special vLLM server (not even standard vLLM). They also seem to do a lot of pre-processing of the tools sent, even before applying a template.

@wicol commented Apr 25, 2024

I'm on 1.0.3, and for some reason it works with "use tools" off, but when I turn it on, it seems to REALLY want to use execute_services for getting states. This is when using the default model and prompt.
Did anyone else notice this difference? It should be the same thing, right?
