
"Let me know what you'd like to do next" repeats after answering the question and doesn't stop #259

Open
contractorwolf opened this issue Apr 26, 2024 · 6 comments

Comments

@contractorwolf

Describe the bug
I was previously running an older version of the 01 server and updated everything a few days ago when the latest (0.2.5) was released. After updating, I first tested with my Atom device. It answers the initial question, but I can see in the terminal output that it gets into a loop after the question and just keeps outputting like this until I kill it:

macbook-pro-3:software jameswo poetry run 01

○                                                                                                                                                            

Starting...                                                                                                                                                  

INFO:     Started server process [42881]
INFO:     Waiting for application startup.

Ready.                                                                                                                                                       

INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:10001 (Press CTRL+C to quit)
INFO:     ('127.0.0.1', 55591) - "WebSocket /" [accepted]
INFO:     connection open

Hold the spacebar to start recording. Press CTRL-C to exit.
 Recording started...
           Recording stopped.
audio/wav /var/folders/xt/5d_33m2d4mxdlx01g3x9g9br0000gn/T/input_20240424223446654411.wav /var/folders/xt/5d_33m2d4mxdlx01g3x9g9br0000gn/T/output_20240424223446656750.wav
>  Audio test.

                                                                                                                                                             
  Your audio is working fine.                                                                                                                                
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.                                                                                                                    
                                                                                                                                                             
  Let me know what you'd like to do next.●
^C
Killed: 9

This (above) is the output from retesting using just the spacebar (to eliminate the device as a possible source of the issue). Not sure what to try next. I rebooted and reinstalled in a fresh conda environment; same issue.

Desktop (please complete the following information):

  • OS: macOS (M1 MacBook)
  • Python version: 3.10 (also tried 3.9)

Any help or suggestions appreciated!

@highb

highb commented Apr 29, 2024

I'm also experiencing this issue with Ubuntu 22.04 and Python 3.10.12.

@contractorwolf
Author

Well, at least I'm not the only one. Have you tried rolling back to a previous version? This sucks for me because I have an awesome idea for a "device case" that I want to demo, but I need a working server first. Let me know if you figure something out @highb; I'll do the same.

@highb

highb commented Apr 30, 2024

> well at least i am not the only one. Have you tried rolling back to a previous version? This sucks for me because I have an awesome idea for a "device case" that I want to demo but i need a working server first. Let me know if you figure something out @highb, ill do the same.

Unfortunately, I haven't had time to tinker with it recently. The Discord has some discussions about similar issues if you want to check them out.

@ai-Ev1lC0rP

I'm having the same issue, granted I'm only trying to run locally while I troubleshoot, which might be part of my problem. I'm running either command-r (not plus), Llama70B, or Mixtral8x7B-V2.8, plus local whisper, plus piper, plus ollama on a separate server, plus the mobile app (which I got working!). But it's in development and I never expected it to just WORK, especially since I'm only trying to create a shortcut for my side button.

I get similar results, but I think part of that is the system prompt that ollama might be passing, since I have OpenWebUI running on that same server to craft models.

One reply I got: "It seems we're encountering similar issues, which I initially attributed to a poorly configured system or corrupted memory. Specifically, it appears to reference a Windows 7 computer, suggesting it may be necessary for utilizing the computer module." Then I had to convince it that it had just finished... on my Mac. (I don't even own a Windows 7 computer, nor have I ever had a reason to have one in ANY context window.)

Getting the context window right or wrong does seem to make a dramatic impact; if you pass too much, it tends to misbehave no matter the model (I've tried a bunch). If you're trying to showcase it, I'd say keep it simple, but be specific about the simplicity, if that makes any sense. I do like the %save_message and %load_message commands, but I think I'm again running into passing too much context. I'm currently trying to get it to teach itself how to use AIFS/Chroma, which should just work, but it seems to be an ongoing issue and conversation.

Not here, just between me and the LLM. Ha!

@contractorwolf
Author

contractorwolf commented May 5, 2024 via email

@g3ar-v

g3ar-v commented May 6, 2024

I found out it's an open-interpreter issue. The force_task_completion breaker messages are not checked correctly in respond.py: the comparison needs a lower() call to normalize the messages before the substring test.

    and not any(
        task_status.lower()
        in interpreter.messages[-1].get("content", "").lower()
        for task_status in force_task_completion_breakers
    )
