🚀 Feature: log variable names in prompts #389
Comments
I'll look into this
Which component is this feature for?
OpenAI Instrumentation
🔖 Feature description
I want to log a variable's value in the trace of an LLM call, where the LLM is invoked from an async task. Currently the association_properties are shared by all the tasks.
So basically, I need a way to report a variable's value per call.
🎤 Why is this feature needed?
To add details to the LLM trace that are specific to each LLM call made in the same context.
✌️ How do you aim to achieve this?
Provide a way to log variables inside an LLM trace. The values should be scoped to the calling function (or its linked functions/processes) and should not be updated globally.
🔄️ Additional Information
I tried setting the association_properties within each task, all of which were spawned from a common parent. But each variable's value was overwritten globally: every read returned the most recently written value.
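The overwriting behaviour described above is what you get from any module-level dict shared across asyncio tasks. A minimal sketch of the problem, and of how a `contextvars.ContextVar` (the mechanism asyncio itself uses for task-local state) would keep the value scoped per task — note that `shared_props` and `_props` here are hypothetical stand-ins for the SDK's association properties, not its actual internals:

```python
import asyncio
import contextvars

# Hypothetical stand-in for globally shared association properties.
shared_props: dict = {}

# Each asyncio Task gets its own copy of the context at creation,
# so set() inside one task is invisible to its siblings.
_props: contextvars.ContextVar[dict] = contextvars.ContextVar(
    "association_properties", default={}
)

async def worker(task_id: str) -> tuple[str, str]:
    shared_props["task_id"] = task_id   # last writer wins, globally
    _props.set({"task_id": task_id})    # scoped to this task's context
    await asyncio.sleep(0.01)           # let the other tasks write too
    return shared_props["task_id"], _props.get()["task_id"]

async def main() -> list[tuple[str, str]]:
    return await asyncio.gather(*(worker(f"t{i}") for i in range(3)))

results = asyncio.run(main())
# shared_props reads all collapse to the last write ("t2"),
# while the ContextVar read in each task returns that task's own value.
```

This is exactly the distinction the feature request is asking for: values set "for the local scope of the function or the linked functions/processes" rather than mutated in a shared global.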
👀 Have you spent some time to check if this feature request has been raised before?
Are you willing to submit a PR?
None