
BUG: context_length_exceeded issue & Max History Message Size setting DOES NOT seem to WORK #96

Open
1 task done
ailoha opened this issue Sep 11, 2023 · 1 comment
Labels
use issue when using `anse.app`

Comments

@ailoha

ailoha commented Sep 11, 2023

What operating system are you using?

Mac

What browser are you using?

Safari, Chrome

Describe the bug

After chatting through the OpenAI API for a while, requests start failing with a context_length_exceeded error. Even after clearing all history, the error persists.

When I switch to a newly created blank conversation, the historical messages from other conversations also seem to be sent along with it, so a previous conversation's token-limit error carries over into the new conversation.

The Max History Message Size setting does not seem to take effect.

Screenshot 2023-09-15, 12:06 PM

What provider are you using?

OpenAI

What prompt did you enter?

(It happens regardless of the prompt.)

Console Logs

Error: This model's maximum context length is 4097 tokens. However, you requested 16332 tokens (13 in the messages, 16319 in the completion). Please reduce the length of the messages or completion.
at (entry.mjs:63:6060)
at (entry.mjs:63:18548)
at (entry.mjs:63:18705)
at (entry.mjs:48:1281)
at (entry.mjs:58:56367)
at (entry.mjs:100:617)
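The numbers in the log explain the failure: the request asked for 16,319 completion tokens on top of 13 message tokens, 16,332 in total, against a 4,097-token context window. A minimal sketch of that check (the exact token accounting OpenAI performs may differ):

```python
# Context window for gpt-3.5-turbo, as stated in the error message above.
CONTEXT_LIMIT = 4097

def request_fits(message_tokens: int, max_completion_tokens: int) -> bool:
    """Return True if messages plus the completion budget fit the window."""
    return message_tokens + max_completion_tokens <= CONTEXT_LIMIT

# The failing request from the log: 13 message tokens + 16,319 completion tokens.
print(request_fits(13, 16319))  # False: 16332 > 4097
print(request_fits(13, 2048))   # True: a smaller completion budget fits
```

This shows why clearing history does not help here: the 13 message tokens are already tiny, and it is the completion budget (the MAX TOKENS setting) that blows past the limit.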

Participation

  • I am willing to submit a pull request for this issue.
@ailoha ailoha added the use issue when using `anse.app` label Sep 11, 2023
@ailoha ailoha changed the title context_length_exceeded BUG: context_length_exceeded & Max History Message Size setting does NOT seem to WORK Sep 15, 2023
@ailoha ailoha changed the title BUG: context_length_exceeded & Max History Message Size setting does NOT seem to WORK BUG: context_length_exceeded issue & Max History Message Size setting DOES NOT seem to WORK Sep 15, 2023
@LyuLumos
Contributor

gpt-3.5-turbo only supports up to 4096 tokens. This is unrelated to the MAX HISTORY MESSAGE SIZE setting. You just need to set MAX TOKENS to a value lower than 4096 (e.g. 2048) and it will work.
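A client could also guard against this automatically. The helper below is a hypothetical sketch (not part of Anse's actual code) that clamps the completion budget to whatever room the messages leave in the context window:

```python
def clamp_max_tokens(message_tokens: int, requested_max: int,
                     context_limit: int = 4097) -> int:
    """Cap the completion budget so messages + completion fit the window.

    Hypothetical client-side guard; 4097 is the gpt-3.5-turbo limit
    quoted in the error log above.
    """
    return max(0, min(requested_max, context_limit - message_tokens))

# The failing request: 16,319 requested, but only 4097 - 13 tokens of room.
print(clamp_max_tokens(13, 16319))  # 4084
print(clamp_max_tokens(13, 2048))   # 2048 (already within budget, unchanged)
```

Manually lowering MAX TOKENS, as suggested above, achieves the same effect without any code change.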
