
🚀 Feature: log variable names in prompts #389

Open
1 task done
mohitk1995 opened this issue Feb 2, 2024 · 1 comment


@mohitk1995

Which component is this feature for?

OpenAI Instrumentation

🔖 Feature description

I want to log a variable's value in an LLM trace, where the LLM is called from an async task. Currently, the association_properties are shared by all tasks.
So basically, I need a way to report a variable's value per call.

🎤 Why is this feature needed?

To add details to the LLM trace that are specific to each LLM call made in the same context.

✌️ How do you aim to achieve this?

Provide a way to log variables inside an LLM trace. The value should be set only for the local scope of the function (and the functions/processes it calls), and should not be updated globally.

🔄️ Additional Information

I tried setting the association_properties within each task spawned from a common parent, but the value was overwritten globally: each time I read it, I got the most recently set value.
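For reference, the scoping behavior being asked for could be sketched with Python's `contextvars`: each `asyncio` task runs in its own copy of the context, so a `ContextVar.set()` inside one task is invisible to its siblings. This is an illustrative sketch only, not the Traceloop API; `set_association_properties` / `get_association_properties` here are hypothetical stand-ins for a task-local store.

```python
# Sketch: task-local association properties via contextvars (hypothetical API).
import asyncio
import contextvars

# Hypothetical per-context store; each asyncio task gets its own context copy.
_association_properties = contextvars.ContextVar("association_properties", default={})

def set_association_properties(props):
    # set() only affects the current task's context, not a shared global.
    _association_properties.set(props)

def get_association_properties():
    return _association_properties.get()

async def llm_call(task_id):
    set_association_properties({"task_id": task_id})
    await asyncio.sleep(0)  # yield so the tasks interleave
    # Despite the interleaving, each task still sees the value it set itself.
    return get_association_properties()["task_id"]

async def main():
    return await asyncio.gather(*(llm_call(i) for i in range(3)))

print(asyncio.run(main()))  # each task keeps its own id: [0, 1, 2]
```

With a plain module-level dict instead of a `ContextVar`, every task would observe the last value written, which matches the overwriting behavior described above.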

👀 Have you spent some time to check if this feature request has been raised before?

  • I checked and didn't find similar issue

Are you willing to submit PR?

None

@nirga nirga changed the title 🚀 Feature: 🚀 Feature: log variable names in prompts Feb 2, 2024
@varaarul

I'll look into this
