Consumption with OpenAI #109
Comments
Hey! Here is an example of how to get the usage values in LLPhant:
This is the generateText function in OpenAiChat.php. You can access the usage values from the response. Hope this helped!
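To make the suggestion above concrete, here is a minimal sketch of reading token usage from the raw OpenAI response. This assumes the response shape of the `openai-php/client` package that LLPhant wraps (a `usage` object with `promptTokens`, `completionTokens`, and `totalTokens`); the config values and variable names are illustrative only, not LLPhant's confirmed API.

```php
<?php
// Hypothetical sketch: reading token usage from the OpenAI chat response.
// The client call and response shape follow openai-php/client conventions.

use LLPhant\OpenAIConfig;
use LLPhant\Chat\OpenAIChat;

$config = new OpenAIConfig();
$config->apiKey = 'sk-...'; // placeholder key

$chat = new OpenAIChat($config);
$answer = $chat->generateText('Hello!');

// Inside generateText(), the raw response returned by the OpenAI client
// exposes a `usage` object; if the chat instance stores that response,
// the token counts can be read like this:
//
//   $usage = $response->usage;
//   $usage->promptTokens;     // tokens consumed by the prompt
//   $usage->completionTokens; // tokens generated in the completion
//   $usage->totalTokens;      // total billed tokens for the request
```

This only works if the last raw response is kept around after `generateText()` returns, which is exactly what the PR discussed below sets out to do.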
Thank you.
No problem, hope it helped! I would love to contribute to the library. If I get permission from the owner, I will add it :)
Hey @samuelgjekic, I would love to have a contribution from you on this!
@ClicShopping, @samuelgjekic, @MaximeThoonsen I provided PR #116 to store the last response from OpenAI. This way we can get the token usage along with the other response objects.
It's done 🚀
Hello,
Does LLPhant expose the token consumption in the OpenAI response? I could not find it.
Example:
`
`