
Add anthropic wrapper for auto tracing #664

Open
NathanHam16 opened this issue May 7, 2024 · 7 comments
@NathanHam16

Feature request

Hi team, it would be great to add an Anthropic wrapper for auto tracing. This issue might have to move to the /langsmith-wrappers repo, but there seems to be little activity there.

Motivation

Anthropic has some of the best LLMs at the moment, and many developers use a mixture of OpenAI and Anthropic in their applications.

@hinthornw
Collaborator

Ya it'll go here - langsmith-wrappers was my experiment bed from last year. Will take this as a TODO - are you using python or JS?

@NathanHam16
Author

Using Python :)

@hackgoofer

hackgoofer commented May 8, 2024

Need this also with python

@hinthornw
Collaborator

Sweet will try to get it out when I can steal a moment

@hackgoofer

Are we talking about a day or a week timeline? Trying to decide if I should move to another platform or write my own.

@hinthornw
Collaborator

hinthornw commented May 8, 2024

If you need something today, this already works:

from anthropic import Anthropic
from langsmith import traceable

anthropic_client = Anthropic()


# Aggregate the streamed text chunks into a single output for the trace
def reduce(texts: list):
    return {"output": "".join(texts)}


@traceable(run_type="llm", reduce_fn=reduce)
def call_anthropic(system: str, messages: list, model: str, max_tokens: int = 4000):
    # Pass the system prompt through to the API (it was accepted but unused before)
    with anthropic_client.messages.stream(
        system=system, messages=messages, model=model, max_tokens=max_tokens
    ) as stream:
        for text in stream.text_stream:
            yield text


# example

for chunk in call_anthropic(
    system="You are a helpful bot",
    messages=[
        {
            "role": "user",
            "content": "Say hello and solve the Riemann hypothesis for me.",
        }
    ],
    model="claude-3-haiku-20240307",
):
    print(chunk, end="")
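For context on what `reduce_fn` does above: LangSmith calls it with the list of chunks the generator yielded, so the trace records one combined output instead of many fragments. A minimal sketch of that aggregation step in isolation (pure Python, no API calls; the chunk values are made-up stand-ins for `stream.text_stream` output):

```python
# Sketch of how a reduce_fn collapses streamed chunks into one traced output.
def reduce(texts: list):
    return {"output": "".join(texts)}


# Hypothetical chunks, as they might arrive from stream.text_stream
chunks = ["Hello", ", ", "world", "!"]
combined = reduce(chunks)
print(combined)  # {'output': 'Hello, world!'}
```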

@Mann1ng

Mann1ng commented May 15, 2024

+1 - yes please! This would be great for TypeScript also - we're about to soft launch our product, which uses Claude 3 Sonnet on the backend. :)
