Add Anthropic wrapper for auto tracing #664
Comments
Yeah, it'll go here - langsmith-wrappers was my experiment bed from last year. Will take this as a TODO - are you using Python or JS?
Using Python :)
Need this with Python as well.
Sweet, will try to get it out when I can steal a moment.
Are we talking about a day or a week timeline? Trying to decide if I should move to another platform or write my own. |
If you need something today, this already works:

```python
from anthropic import Anthropic
from langsmith import traceable

anthropic_client = Anthropic()


def reduce(texts: list):
    # Combine the streamed chunks into a single output for the traced run.
    return {"output": "".join(texts)}


@traceable(run_type="llm", reduce_fn=reduce)
def call_anthropic(system: str, messages: list, model: str, max_tokens: int = 4000):
    # Forward the system prompt along with the messages.
    with anthropic_client.messages.stream(
        system=system, messages=messages, model=model, max_tokens=max_tokens
    ) as stream:
        for text in stream.text_stream:
            yield text


for chunk in call_anthropic(
    system="You are a helpful bot",
    messages=[
        {
            "role": "user",
            "content": "Say hello and solve the Riemann hypothesis for me.",
        }
    ],
    model="claude-3-haiku-20240307",
):
    print(chunk, end="")
```
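For context, the `reduce_fn` is what lets LangSmith record the streamed generator as a single LLM run: it receives the list of yielded chunks and collapses them into one output. A minimal sketch with no API call (`fake_stream` is a hypothetical stand-in for `stream.text_stream`):

```python
def reduce(texts: list):
    # Same reducer as above: join all streamed chunks into one output dict.
    return {"output": "".join(texts)}


def fake_stream():
    # Hypothetical stand-in for the SDK's stream.text_stream generator.
    yield from ["Hello", ", ", "world"]


chunks = list(fake_stream())
print(reduce(chunks))  # {'output': 'Hello, world'}
```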
+1 - Yes please! This would be great for TypeScript as well - we're about to soft launch our product, which uses Claude 3 Sonnet on the backend. :)
Feature request
Hi team, it would be great to add an Anthropic wrapper for auto tracing. This issue might have to move to the /langsmith-wrappers repo, but there seems to be little activity there.
Motivation
Anthropic has some of the best LLMs at the moment, and many developers use a mixture of OpenAI and Anthropic in their applications.