LangChain Integration #517
Labels: feature (New feature or request)
Comments
Hey @slavakurilyak - I'd love to hear more about this LangChain integration idea. From our testing with LangChain JS, we've explored using it primarily by wrapping any call from LangChain within a `step.run()`. What type of integration would you like to see? How might that work for you?

```typescript
// Imports are illustrative; exact LangChain JS module paths vary by version,
// and `inngest` is assumed to be a configured Inngest client.
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";
import { inngest } from "./client";

export const basicChain = inngest.createFunction(
  { name: "Basic Chain" },
  { event: "ai/basic.chain" },
  async ({ event, step }) => {
    // Get the input data from the event payload
    const product = event.data.product;

    const model = new OpenAI({ temperature: 0 });
    const prompt = PromptTemplate.fromTemplate(
      "What is a good name for a company that makes {product}?"
    );
    const chainA = new LLMChain({ llm: model, prompt });

    // Wrapping the chain call in step.run makes it durable and retriable
    const result = await step.run("First prompt", async () => {
      return await chainA.call({ product });
    });

    return { message: "success" };
  }
);
```
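For reference, a function like the one above only runs when a matching `ai/basic.chain` event is sent. A minimal sketch of the event shape it expects (the helper name is made up; with the Inngest SDK you would pass this object to `inngest.send`):

```typescript
// Hypothetical helper: builds the event payload the "Basic Chain" function
// above expects. Only the event name and data shape come from the original
// snippet; the helper itself is illustrative.
type BasicChainEvent = {
  name: "ai/basic.chain";
  data: { product: string };
};

export function makeBasicChainEvent(product: string): BasicChainEvent {
  return { name: "ai/basic.chain", data: { product } };
}

// With an Inngest client: await inngest.send(makeBasicChainEvent("socks"));
```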
IGassmann pushed a commit that referenced this issue on Feb 14, 2024 (debounce docs: add debounce to nav, add reference image, update use cases, update "how it works").
Is your feature request related to a problem? Please describe.
There is currently no straightforward way to run LangChain-supported models in serverless environments without dealing with infrastructure or state concerns. This is a barrier for developers who want to deploy and manage LLM and chat model applications serverlessly.
Describe the solution you'd like
I propose integrating LangChain with Inngest. This would let developers run LangChain-managed language models in a serverless environment, with infrastructure and state concerns handled automatically, greatly simplifying the deployment and management of LLM applications.
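As a sketch of what such an integration could look like (all names here are hypothetical: `runStep` stands in for Inngest's `step.run`, and the LLM calls are mocked with string templates), each LangChain call would become its own retriable, independently memoized step:

```typescript
// Illustrative sketch only: splitting a LangChain-style pipeline into
// independently retried steps. A failure in the second step would not
// re-run (and re-bill) the first.
type StepRunner = <T>(name: string, fn: () => Promise<T>) => Promise<T>;

export async function namingPipeline(
  product: string,
  runStep: StepRunner
): Promise<{ name: string; slogan: string }> {
  // Stand-in for a first LLMChain call
  const name = await runStep("generate-name", async () => `Acme ${product}`);
  // Stand-in for a second chain call that depends on the first result
  const slogan = await runStep(
    "generate-slogan",
    async () => `${name}: simply the best`
  );
  return { name, slogan };
}
```

In a real integration the `StepRunner` would be Inngest's `step.run`, and each step's result would be persisted so a retried function resumes from the last successful step.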
Describe alternatives you've considered
Alternatives to LangChain include LlamaIndex (previously known as GPT Index).
Additional context
Inngest's blog post on May 16, 2023, highlighted their interest in integrating with LangChain to allow people to run LangChain models in serverless environments. Given Inngest's mission to simplify and automate serverless workflows, and LangChain's goal to enable developers to build LLM applications, this integration seems like a natural fit.