Describe the bug
When we run even a simple prompt in OI, the command window rapidly fills with repeated copies of the entire stream. I've attached screenshots to illustrate.
My guess is that when OI sends a chunk to the CMD window it is running in, it doesn't append just the chunk currently being streamed; it re-prints the entire set of chunks received so far for the current OpenAI call.
This behavior is consistent and easy to reproduce: ask OI to do something that involves multiple steps and watch the output.
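To make the suspected bug concrete, here is a minimal sketch (my own illustration, not OI's actual rendering code) of the fix the report implies: track how much of the accumulated stream has already been written and emit only the unseen tail, instead of re-printing the whole accumulated text on every chunk.

```python
accumulated = ""
printed = 0  # how many characters have already been written to the terminal

def render_chunk(chunk: str) -> str:
    """Append the new chunk and return only the text not yet shown."""
    global accumulated, printed
    accumulated += chunk
    new_text = accumulated[printed:]  # delta since the last render
    printed = len(accumulated)
    return new_text

# Joining the deltas reproduces the stream exactly once, with no repeats.
output = "".join(render_chunk(c) for c in ["Hel", "lo, ", "world"])
```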
One approach would be to stream the returned chunks to a file and then use a Python package like watchdog to write only the new chunks to the CMD as they arrive, appending to the content that is already there.
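The file-based idea boils down to remembering a read offset and fetching only the bytes appended since the last read. watchdog would supply event-driven change notifications; the sketch below (my assumption of the approach, using only the standard library and a simulated writer) shows the same incremental-read mechanic with a plain offset.

```python
import os
import tempfile

def read_new_text(path: str, offset: int) -> tuple[str, int]:
    """Return text appended to the file since `offset`, plus the new offset."""
    with open(path, "r", encoding="utf-8") as f:
        f.seek(offset)
        new_text = f.read()
        return new_text, f.tell()

# Simulate a streaming writer and an incremental reader sharing a temp file.
fd, path = tempfile.mkstemp()
os.close(fd)
offset = 0
shown = []
for chunk in ["step 1\n", "step 2\n"]:
    with open(path, "a", encoding="utf-8") as f:
        f.write(chunk)  # the writer appends each chunk as it arrives
    new_text, offset = read_new_text(path, offset)
    shown.append(new_text)  # the reader sees each chunk exactly once
os.remove(path)
```

In a real implementation, a watchdog `FileSystemEventHandler` would trigger `read_new_text` on each modification event instead of this polling loop.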
Reproduce
Run OI and ask it to perform any medium-to-long multi-step task, especially one that involves writing a program.
Expected behavior
Smooth UI/UX where only the latest chunks are appended to the current output.
Screenshots
Open Interpreter version
0.2.0
Python version
3.11.4
Operating System name and version
Windows 11
Additional context
No response
Would need to incorporate curses or something similar to fix this.
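curses would let the interface redraw a screen region in place rather than re-printing it. As a lighter-weight illustration of the same idea (my own sketch, not the planned fix), ANSI escape codes can move the cursor back up over the previously printed block and clear it before repainting:

```python
def rewind(lines: int) -> str:
    """ANSI sequence: move the cursor up `lines` lines and erase to end of screen."""
    return f"\x1b[{lines}A\x1b[0J" if lines else ""

frame_height = 0  # how many lines the last frame occupied

def redraw(text: str) -> str:
    """Return the control sequence plus text that repaints the block in place."""
    global frame_height
    out = rewind(frame_height) + text
    frame_height = text.count("\n") + 1
    return out
```

On each new frame, `sys.stdout.write(redraw(full_text))` would overwrite the previous render instead of scrolling duplicate copies past the user (modern Windows 11 terminals support these escape sequences).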
I am working on a replacement for the terminal_interface that fixes this: #976. It is more GUI-like.