Can VoiceAssistant learn? #191
Comments
In theory you could have a text helper entity which you expose to ChatGPT and tell it to use the contents as part of its prompt, and then in the prompt say something like “When I ask you to remember XYZ update the Prompt entity” |
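A minimal sketch of that idea, assuming an `input_text` helper named `input_text.assistant_memory` (a hypothetical name) that is exposed to the conversation agent, and a prompt template that supports Home Assistant's Jinja templating. The `states()` call injects the helper's current value into every prompt:

```yaml
# Fragment of the conversation agent's prompt template (Jinja2).
# input_text.assistant_memory is a hypothetical helper entity.
Your saved memory: {{ states('input_text.assistant_memory') }}
When I ask you to remember something, update your memory by calling the
input_text.set_value service on input_text.assistant_memory with the
full revised memory text.
```

Note that `input_text` entities are capped at 255 characters, which is why longer-lived notes tend to get moved to a to-do list instead, as discussed below.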
I created a to-do list and told ChatGPT to take notes. This works very well for ChatGPT 4, but not really for ChatGPT 3.5 |
what doesn't work with ChatGPT 3.5? |
ChatGPT 3.5 tends to write irrelevant notes like "optimize the search", while ChatGPT 4 actually writes down HOW it can do that after a discussion. ChatGPT 5 is due in a few months, so I hope we can get rid of ChatGPT 3.5. |
I can take notes with GPT 3.5 though. But I can't get it linked up so that when I execute a command, it also looks at the to-do list. Do you have a solution for that? |
What do you mean? Do you want to get the to-do list items? |
I would like to use the to-do list as a context memory / prompt. In the template, I would have to set it up in the prompt area so that it also queries the to-do list before giving an action or answer. I have pointed it at the to-do list, and describing it also works ("if I tell it to remember xyz"). But it doesn't call the list up on its own. |
I called the Assist "Kiana", then I created a to-do list "Kiana's notebook". |
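One possible way to let the assistant read the list on demand is to expose a custom function to the model. This is only a sketch under assumptions: it assumes the integration's custom-function support with `type: script`, Home Assistant's `todo.get_items` service, and a hypothetical entity id `todo.kianas_notebook` (rename to match your list):

```yaml
# Hypothetical custom function exposing the to-do list to the LLM.
- spec:
    name: get_notes
    description: Read all items from the assistant's notebook to-do list.
    parameters:
      type: object
      properties: {}
  function:
    type: script
    sequence:
      # todo.get_items returns the list items as response data.
      - service: todo.get_items
        target:
          entity_id: todo.kianas_notebook
        response_variable: notes
      - stop: ""
        response_variable: notes
```

With something like this registered, the prompt only needs to tell the model to call `get_notes` before answering, instead of hoping it consults the list unprompted.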
Can you please show me the text of your prompt? I just tried to implement it the same way. The note-taking works well, but it doesn't retrieve the things from there. |
I would be interested as well. Note-taking works well, and it also works if I ask it to check the notes. But it does not retrieve them on its own, even if I tell it to in the prompt. |
This is what I added to the prompt: |
I've been experimenting with my own implementation for a couple of months, which I've put up as a PR example here: 97e1cb6. With this implementation I'm playing with two types of recall, so that in some cases I could try embedding all short-term memories in the initial prompt while allowing the LLM to query longer-term memories. This is a bit of experimentation to manage token space and the cost of iterative function calls, but it seems to work well on balance: OpenAI GPT-3/4 seems to grasp the concepts of short-term, long-term, and archive memories. I think it could be simplified further (less prompt text, etc.), but I wanted to see how various data structures performed (dicts seem to work well for now). I was planning on implementing a Python version so it could be embedded in the integration, if that's sensible. |
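A rough Python sketch of the split described above (the names and structure are hypothetical, not the PR's actual code): short-term memories are small enough to embed wholesale in the initial prompt, while older entries overflow into a long-term dict that is only read when the LLM explicitly asks for a topic.

```python
class MemoryStore:
    """Two-tier memory: short-term notes ride along in every prompt,
    long-term notes are fetched on demand via a function call."""

    def __init__(self, short_term_limit=5):
        self.short_term = {}   # topic -> note, embedded in the prompt
        self.long_term = {}    # topic -> note, queried on demand
        self.short_term_limit = short_term_limit

    def remember(self, topic, note):
        """Store a note; move the oldest short-term note to long-term
        once the short-term dict exceeds its limit (dicts preserve
        insertion order, so the first key is the oldest)."""
        self.short_term[topic] = note
        while len(self.short_term) > self.short_term_limit:
            oldest = next(iter(self.short_term))
            self.long_term[oldest] = self.short_term.pop(oldest)

    def prompt_context(self):
        """Text block to embed in the initial prompt (short-term only),
        keeping the token cost bounded by short_term_limit."""
        return "\n".join(f"- {t}: {n}" for t, n in self.short_term.items())

    def recall(self, topic):
        """Lookup exposed to the LLM as a callable function; checks
        short-term first, then falls back to long-term."""
        return self.short_term.get(topic) or self.long_term.get(topic, "")
```

The appeal of the dict-based layout is that `prompt_context()` caps the per-turn token cost while `recall()` keeps everything else reachable at the price of one extra function call.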
I tested it, but it doesn't work for me. It seems to be saving something; the counter of sensor.memory changes. But it does not retrieve the information. |
I don't know much about ChatGPT or programming, so this question may seem stupid: if GPT doesn't understand a command and I explain it differently, it saves that as context, but only as long as the chat window is open. Is there a way to issue a command like "learn this" or something similar so that GPT saves it in the prompt?