Replies: 1 comment
-
Hey @renaisser! Fancy seeing you here again. Hope everything else is going well besides this little hiccup!

Based on the information provided in the LlamaIndex repository, you can view the tokens consumed when calling `AnthropicMultiModal` by inspecting the `additional_kwargs` attribute of the response. Here's an example of how you can access this information after making a call to any of the completion methods:

```python
response = anthropic_multi_modal.complete(prompt="Your prompt here", image_documents=[])
token_usage = response.additional_kwargs
print(token_usage)
```

This will print a dictionary with the token usage information, such as:

```python
{
    "prompt_tokens": 10,
    "completion_tokens": 40,
    "total_tokens": 50
}
```

This approach allows you to view the tokens consumed by the operations performed via `AnthropicMultiModal`.

I hope this helps! If you have any further questions, feel free to ask.
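If you want to aggregate usage across several calls, a minimal sketch follows. Note the assumptions: `FakeResponse` is a hypothetical stand-in for the response object that `AnthropicMultiModal.complete()` returns; the real object exposes the same `additional_kwargs` dict shown above, so `total_tokens` would work on it unchanged.

```python
from dataclasses import dataclass, field

@dataclass
class FakeResponse:
    # Stand-in for the real completion response; only the
    # `additional_kwargs` dict matters for token accounting.
    additional_kwargs: dict = field(default_factory=dict)

def total_tokens(responses):
    """Sum prompt and completion tokens over a batch of responses."""
    totals = {"prompt_tokens": 0, "completion_tokens": 0}
    for r in responses:
        for key in totals:
            totals[key] += r.additional_kwargs.get(key, 0)
    totals["total_tokens"] = totals["prompt_tokens"] + totals["completion_tokens"]
    return totals

responses = [
    FakeResponse({"prompt_tokens": 10, "completion_tokens": 40}),
    FakeResponse({"prompt_tokens": 5, "completion_tokens": 20}),
]
print(total_tokens(responses))
# {'prompt_tokens': 15, 'completion_tokens': 60, 'total_tokens': 75}
```

This is just bookkeeping on the returned dict, so it stays decoupled from any particular tracing backend.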
-
Is there any way to see the tokens used when you invoke the `MultiModalLLMCompletionProgram` method through `AnthropicMultiModal`?
PS: I'm trying to use LangFuse or TraceLoop, but neither works.
Any help? Thanks a lot