[Easy] Thoughts on preventing sending additional prompts when the length of a chat is longer than 5 #449
Comments
Thanks for this request. It's a good idea indeed.
Hey, so I did some hacking last month and made some progress.
@haopengsong I've released today the newest version of big-AGI, which has cost estimation for various models. Let's look at the options:
Thanks for the update! Great progress on the new features. I really like the cost estimation.
Thanks for the responses, I'm referencing #480 to keep track of those.
Hello,
Correct me if I'm wrong.
When interacting with the OpenAI API, the whole chat in the context window is sent with every request, so the number of context tokens can grow much larger than the number of generated tokens (pic below). In my case, this may have incurred large unwanted costs.
It would be nice to limit the length of each chat (to 5 messages, for example).
I need some help with thoughts on where to make changes that would allow such a feature...
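To illustrate the request, here is a minimal sketch of trimming the history before each API call. The names (`ChatMessage`, `truncateHistory`) are hypothetical, not big-AGI's actual code; the idea is simply to keep the system prompt and only the most recent N turns:

```typescript
// Hypothetical message shape, mirroring the OpenAI chat completion format.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Keep any system prompts plus the last `maxMessages` conversational turns,
// so the payload sent to the API stays bounded regardless of chat length.
function truncateHistory(messages: ChatMessage[], maxMessages: number): ChatMessage[] {
  const system = messages.filter((m) => m.role === 'system');
  const turns = messages.filter((m) => m.role !== 'system');
  return [...system, ...turns.slice(-maxMessages)];
}
```

A length cap like this is the simplest option; token-based truncation or summarizing older turns would bound cost more precisely, at the price of extra complexity.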