Return prompt_tokens via API #162
Comments
Thanks for raising this issue @prd-tuong-nguyen, we can definitely get this in for the next release!
Hey @prd-tuong-nguyen, this is now done following #165. Note that the Python SDK will be updated in PyPI with the next release. Until then, you can install from the repo to get the latest changes.
@tgaddair Awesome bro, thank you so much
Feature request
Currently, the API only returns generated_tokens.
I think the API should also return prompt_tokens, the number of tokens in the request prompt. This would be really useful for tracking response speed relative to the number of tokens.
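As a sketch of how the requested field could be consumed, here is a minimal example assuming the server returns prompt_tokens alongside the existing generated_tokens in the response details. The exact field names and response shape are assumptions for illustration, not the actual API:

```python
# Hypothetical response shape: assumes prompt_tokens is added next to
# the existing generated_tokens in the "details" object.
response = {
    "generated_text": "...",
    "details": {
        "generated_tokens": 128,
        "prompt_tokens": 512,  # the requested field (hypothetical name)
    },
}

def tokens_per_second(details: dict, elapsed_s: float) -> float:
    """Throughput over the full request: prompt plus generated tokens."""
    total = details["prompt_tokens"] + details["generated_tokens"]
    return total / elapsed_s

# 640 total tokens processed in 4 seconds -> 160.0 tokens/s
print(tokens_per_second(response["details"], 4.0))
```

With only generated_tokens available today, the prompt portion of the work is invisible to this kind of measurement, which is why the client cannot compute end-to-end throughput.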
Motivation
Used to track speed according to the number of tokens.
Your contribution
:(