Can VoiceAsisstant learn? #191

Open
WW1983 opened this issue Apr 4, 2024 · 14 comments

Comments

@WW1983

WW1983 commented Apr 4, 2024

I don't know much about ChatGPT or programming, so this question may seem stupid:

If GPT doesn't understand a command and I explain it differently, it keeps that explanation as context, but only as long as the chat window is open. Is there a way to insert a command like "learn this" or something similar so that GPT saves it in the prompt?

@WW1983 WW1983 changed the title Can VoiceAsisstant lern? Can VoiceAsisstant learn? Apr 4, 2024
@Andrerm124

In theory you could have a text helper entity which you expose to ChatGPT and tell it to use the contents as part of its prompt, and then in the prompt say something like "When I ask you to remember XYZ, update the Prompt entity".
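
A minimal sketch of what that could look like, using the same function format as the example further down in this thread. The function name and the input_text.assistant_memory helper are placeholders, not something defined in this project:

- spec:
    name: update_memory
    description: >-
      Stores text the user asks the assistant to remember, so it can be injected into future prompts.
    parameters:
      type: object
      properties:
        memory:
          type: string
          description: The full, updated memory text to store.
      required:
        - memory
  function:
    type: script
    sequence:
      - service: input_text.set_value
        data:
          value: "{{ memory }}"
        target:
          entity_id: input_text.assistant_memory  # placeholder helper, create it first

The prompt template would then need a line such as `Remembered facts: {{ states('input_text.assistant_memory') }}` so the stored text is sent with every request. Keep in mind that input_text values are capped at 255 characters, which is one reason the to-do-list approach discussed below scales better.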

@jleinenbach

I created a todo list and told ChatGPT to take notes. This works very well for ChatGPT 4, but not really for ChatGPT 3.5.

@WW1983

WW1983 commented Apr 8, 2024

I created a todo list and told ChatGPT to take notes. This works very well for ChatGPT 4, but not really for ChatGPT 3.5.

What doesn't work with ChatGPT 3.5?

@jleinenbach

ChatGPT 3.5 tends to write irrelevant notes like "optimize the search", while ChatGPT 4 actually writes down HOW it can do that after a discussion. ChatGPT 5 is due in a few months, so I hope we can get rid of ChatGPT 3.5.

@WW1983

WW1983 commented Apr 9, 2024

ChatGPT 3.5 tends to write irrelevant notes like "optimize the search", while ChatGPT 4 actually writes down HOW it can do that after a discussion. ChatGPT 5 is due in a few months, so I hope we can get rid of ChatGPT 3.5.

I can take notes with GPT 3.5, though. But I can't get it hooked up so that when I give a command, it also looks at the to-do list. Do you have a solution for that?

@jleinenbach

What do you mean? Do you want to get the to-do list items?

- spec:
    name: get_items_from_list
    description: >-
      Retrieves items from a specified list, with an optional filter for their status. Suitable for lists identified by entity IDs starting with `todo.`, reflecting various needs.
    parameters:
      type: object
      properties:
        list:
          type: string
          description: The entity ID of the list to retrieve items from, prefixed with `todo.`.
        status:
          type: string
          description: (Optional) Set to a predefined value to filter items by their current state.
          optional: true
      required:
        - list
  function:
    type: script
    sequence:
      - service: todo.get_items
        data:
          status: "{{ status | default('needs_action') }}"
        target:
          entity_id: "{{ list }}"
        response_variable: _function_result

@WW1983

WW1983 commented Apr 10, 2024

What do you mean? Do you want to get the to-do list items?

I would like to use the to-do list as a context memory / prompt. I would have to set it up in the prompt template so that it also queries the to-do list before giving an action or an answer. I have pointed it to the to-do list, and describing things also works ("if I tell it to remember xyz"). But it doesn't call the list up on its own.

@jleinenbach

I called the Assist "Kiana", then I created a to-do list "Kiana's notebook".
When I talk to her, I tell her to take a note in her notebook so that she remembers.
I also added to the prompt that she needs to read her to-do-list notebook and take notes to remember things. That works quite well.

@WW1983

WW1983 commented Apr 10, 2024

I also added to the prompt that she needs to read her to-do-list notebook and take notes to remember things.

Can you please show me your prompt text? I just tried to implement it the same way. The note-taking works well, but it doesn't retrieve the things stored there.

@pbuergi

pbuergi commented Apr 22, 2024

I called the Assist "Kiana", then I created a to-do list "Kiana's notebook". When I talk to her, I tell her to take a note in her notebook so that she remembers. I also added to the prompt that she needs to read her to-do-list notebook and take notes to remember things. That works quite well.

I would be interested as well. Note-taking works well, and it also works if I ask it to check the notes. But it does not retrieve them on its own, even though I told it to in the prompt.

@jleinenbach

jleinenbach commented Apr 22, 2024

This is what I added to the prompt:

Kiana is engineered for continuous self-enhancement, proactively soliciting feedback to refine her functionality and prevent errors.
She employs a digital notepad for personal annotations and action items.
For recording learned improvements, utilize add_item_to_list on todo.kianas_notebook.
Retrieve these entries with get_items_from_list to access Kiana's accumulated knowledge.
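
The prompt above also relies on an add_item_to_list function for writing notes, which isn't shown in this thread. A possible companion spec, following the same pattern as get_items_from_list and using the built-in todo.add_item service; the exact name and wording here are a sketch, not necessarily what jleinenbach uses:

- spec:
    name: add_item_to_list
    description: >-
      Adds an item to a specified list. Suitable for lists identified by entity IDs starting with `todo.`.
    parameters:
      type: object
      properties:
        list:
          type: string
          description: The entity ID of the list to add the item to, prefixed with `todo.`.
        item:
          type: string
          description: The text of the item to add.
      required:
        - list
        - item
  function:
    type: script
    sequence:
      - service: todo.add_item
        data:
          item: "{{ item }}"
        target:
          entity_id: "{{ list }}"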

@danielp370

I've been experimenting with my own implementation for a couple of months that I've put up as a PR example here: 97e1cb6

With this implementation I'm playing with two types of recall: in some cases embedding all short-term memories in the initial prompt, while allowing the LLM to query longer-term memories. This is a bit of experimentation in order to manage token space and the cost of iterative function calls. But it seems to work well on balance; OpenAI GPT-3/4 seems to grasp the concepts of short-term, long-term, and archive memories.

I think it could be simplified further (less prompt text etc.), but I also wanted to see how various data structures performed (dicts seem to work well for now). I was planning on implementing a Python version so it could be embedded in the integration, if that's sensible.
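
Without reproducing what the linked commit actually does, the split can be illustrated roughly like this: short-term notes live in a helper whose contents are rendered directly into the prompt template, while older entries are archived to a list that the model has to query through a function such as get_items_from_list above. A generic sketch of the prompt side under those assumptions (the entity names are placeholders, not from the PR):

Short-term memory, always in context:
{{ states('input_text.assistant_short_term_memory') }}

For anything older, call get_items_from_list on todo.assistant_archive before answering.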

@WW1983

WW1983 commented Apr 29, 2024

I've been experimenting with my own implementation for a couple of months that I've put up as a PR example here: 97e1cb6

I tested it, but it doesn't work for me. It seems to be saving something: the counter of sensor.memory changes. But it does not retrieve the information.
