---
title: Langchain
---

The Langchain integration lets you monitor your Langchain agents and chains with a single line of code.

You should create a new instance of the callback handler for each invocation.
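To see why a fresh handler per invocation matters, here is a toy sketch in plain Python (the class and function names are illustrative, not the Literal AI or Langchain API): a handler that accumulates run events will interleave events from unrelated invocations if it is reused.

```python
class ToyCallbackHandler:
    """Collects the events of a single chain invocation."""

    def __init__(self):
        self.events = []

    def on_event(self, name):
        self.events.append(name)


def run_chain(topic, handler):
    # Stand-in for runnable.invoke(..., config=RunnableConfig(callbacks=[handler]))
    handler.on_event(f"start:{topic}")
    handler.on_event(f"end:{topic}")


# Reusing one handler: events from two unrelated runs end up mixed together.
shared = ToyCallbackHandler()
run_chain("ice cream", shared)
run_chain("pizza", shared)
assert len(shared.events) == 4  # both invocations in one trace

# Fresh handler per invocation: each trace stays self-contained.
for topic in ("ice cream", "pizza"):
    cb = ToyCallbackHandler()
    run_chain(topic, cb)
    assert len(cb.events) == 2  # only this invocation's events
```

The same reasoning applies to the real callback handler below: create it just before each invocation rather than sharing one across calls.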

```python Python
import os

from literalai import LiteralClient

from langchain_openai import ChatOpenAI
from langchain.schema.runnable.config import RunnableConfig
from langchain.schema import StrOutputParser
from langchain.prompts import ChatPromptTemplate

literal_client = LiteralClient(api_key=os.getenv("LITERAL_API_KEY"))

cb = literal_client.langchain_callback()

prompt = ChatPromptTemplate.from_messages(
    [("human", "Tell me a short joke about {topic}")]
)

model = ChatOpenAI(streaming=True)
runnable = prompt | model | StrOutputParser()

res = runnable.invoke(
    {"topic": "ice cream"},
    config=RunnableConfig(callbacks=[cb], run_name="joke"),
)
```

```typescript TypeScript
import { LiteralClient } from '@literalai/client';

import { StringOutputParser } from '@langchain/core/output_parsers';
import { ChatPromptTemplate } from '@langchain/core/prompts';
import { ChatOpenAI } from '@langchain/openai';

const client = new LiteralClient(process.env['LITERAL_API_KEY']); // This is the default and can be omitted

const cb = client.instrumentation.langchain.literalCallback();

// Example of using the callback
const prompt = ChatPromptTemplate.fromMessages([
    ['human', 'Tell me a short joke about {topic}']
  ]);

const model = new ChatOpenAI({});
const outputParser = new StringOutputParser();

const chain = prompt.pipe(model).pipe(outputParser);

const response = await chain.invoke(
  { topic: 'ice cream' },
  {
    runName: 'joke',
    callbacks: [cb]
  }
);
```

## Multiple Langchain calls in a single thread

You can combine the Langchain callback handler with the concept of Thread to monitor multiple Langchain calls in a single thread.
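The grouping mechanism can be pictured with a toy sketch in plain Python (again illustrative names, not the Literal AI API): each invocation still gets its own handler, but every handler carries the enclosing thread's id, so all of their runs land in the same thread.

```python
import uuid


class ToyThread:
    """Stand-in for a monitoring thread with a unique id."""

    def __init__(self, name):
        self.name = name
        self.id = str(uuid.uuid4())


class ToyCallbackHandler:
    """Tags every run it records with the thread id it was created for."""

    def __init__(self, thread_id):
        self.thread_id = thread_id
        self.runs = []

    def on_run(self, name):
        self.runs.append((self.thread_id, name))


thread = ToyThread(name="Langchain example")

traces = []
for topic in ("ice cream", "pizza"):
    cb = ToyCallbackHandler(thread.id)  # fresh handler per invocation
    cb.on_run(f"joke:{topic}")          # stand-in for a chain invocation
    traces.extend(cb.runs)

# Both runs carry the same thread id, so they group under one thread.
assert all(tid == thread.id for tid, _ in traces)
```

The real snippets below follow the same pattern: open a thread, then create the handler inside its scope (Python) or pass the thread id explicitly (TypeScript).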

```python Python
import os

from literalai import LiteralClient

literal_client = LiteralClient(api_key=os.getenv("LITERAL_API_KEY"))

with literal_client.thread(name="Langchain example") as thread:
    cb = literal_client.langchain_callback()
    # Call your Langchain agent here
```

```typescript TypeScript
import { LiteralClient } from '@literalai/client';

const client = new LiteralClient(process.env['LITERAL_API_KEY']); // This is the default and can be omitted

const thread = await client.thread({ name: "Langchain Example" }).upsert();
const cb = client.instrumentation.langchain.literalCallback(thread.id);

// Call your Langchain agent here
```