Integrate with Langchain #36

Open
Sharpz7 opened this issue Aug 1, 2023 · 2 comments

Sharpz7 commented Aug 1, 2023

Just wanted to add this here.

langchain-ai/langchain#8563

Ideally we would write it into LangChain in a way that lets the user choose the URL of the endpoint they want to use (since it's not recommended to rely on chat.petals.dev).
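
Something like this is what I had in mind: a custom LangChain LLM wrapper where the endpoint is just a constructor argument. This is only a sketch, not something that exists in LangChain today; the `/api/v1/generate` path, the `model`/`inputs`/`max_new_tokens` fields and the `ok`/`outputs` response keys are assumptions based on the chat.petals.dev HTTP API.

```python
from typing import Any, List, Mapping, Optional

import requests
from langchain.llms.base import LLM


class PetalsAPI(LLM):
    """Sketch of an LLM wrapper for a user-chosen Petals HTTP API endpoint."""

    # Any deployment of the Petals HTTP API can be used here,
    # not just the public chat.petals.dev instance.
    endpoint_url: str = "https://chat.petals.dev/api/v1/generate"
    model: str = "petals-team/StableBeluga2"
    max_new_tokens: int = 128

    @property
    def _llm_type(self) -> str:
        return "petals_api"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
        # POST the prompt to whichever endpoint the user configured.
        # (Stop sequences are not handled in this sketch.)
        response = requests.post(
            self.endpoint_url,
            data={
                "model": self.model,
                "inputs": prompt,
                "max_new_tokens": self.max_new_tokens,
            },
            timeout=300,
        )
        response.raise_for_status()
        payload = response.json()
        if not payload.get("ok", False):
            raise RuntimeError(f"Petals API returned an error: {payload}")
        return payload["outputs"]

    @property
    def _identifying_params(self) -> Mapping[str, Any]:
        return {"endpoint_url": self.endpoint_url, "model": self.model}
```

Swapping in a self-hosted endpoint would then just be `PetalsAPI(endpoint_url=...)`.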

borzunov (Member) commented Sep 27, 2023

For the record, there is an existing integration by the LangChain devs that runs the native Petals client: https://python.langchain.com/docs/integrations/llms/petals. It connects to the swarm directly (without using this API endpoint), but requires downloading the input/output embeddings of the model (a few GB) before running.
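
For reference, using that integration looks roughly like this (the model name and the `HUGGINGFACE_API_KEY` variable follow the linked docs page; treat the exact values as approximate):

```python
import os

from langchain.llms import Petals

# The native client joins the swarm directly; on first use it downloads
# the model's input/output embeddings (a few GB) to the local machine.
os.environ["HUGGINGFACE_API_KEY"] = "<your Hugging Face token>"

llm = Petals(model_name="bigscience/bloom-petals")
print(llm("What would be a good name for a company that makes colorful socks?"))
```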

@Sharpz7, do you think Petals API support would still be helpful, even though LangChain provides an integration for the native client?

Sharpz7 (Author) commented Sep 27, 2023

Hmm. Potentially for applications where space is limited? I think having more options rather than fewer is still useful.

Also, it would lower the barrier to entry; that is something using the Petals API directly would give you.
