We calculate `token_count` as part of the `StartPromptEvent` and `FinishPromptEvent` events here. This can be improved in the following ways:

- For `StartPromptEvent`, we should calculate this value once in the `BasePromptDriver` and then pass the value to subclasses.
- For `FinishPromptEvent`, many of the LLMs will return the token count in the response. We should not calculate it ourselves unless the value is missing.
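The proposed changes could look roughly like the sketch below. The class and event shapes here are hypothetical stand-ins loosely modeled on the names in this issue, not Griptape's actual API; `call_llm` and the whitespace tokenizer are placeholders for illustration only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


# Hypothetical event shapes; the real events carry more fields.
@dataclass
class StartPromptEvent:
    token_count: int


@dataclass
class FinishPromptEvent:
    token_count: int


class BasePromptDriver:
    def token_count(self, text: str) -> int:
        # Placeholder tokenizer; the real driver would use the model's tokenizer.
        return len(text.split())

    def call_llm(self, prompt: str) -> Tuple[str, Optional[int]]:
        # Subclasses override this. Returns (response_text, reported_token_count);
        # this stub reports no usage data.
        return "ok", None

    def run(self, prompt: str) -> FinishPromptEvent:
        # Calculate the input token count once here in the base class and pass
        # the value along, instead of recalculating it in every subclass.
        input_tokens = self.token_count(prompt)
        start_event = StartPromptEvent(token_count=input_tokens)

        response_text, reported_tokens = self.call_llm(prompt)

        # Prefer the token count reported by the LLM in its response; only
        # calculate it ourselves when the value is missing.
        output_tokens = (
            reported_tokens
            if reported_tokens is not None
            else self.token_count(response_text)
        )
        return FinishPromptEvent(token_count=output_tokens)
```

A subclass whose provider reports usage would simply return that count from `call_llm`, and the base class would skip the local calculation entirely.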