
Use this with local model #129

Open
amztc34283 opened this issue Apr 3, 2024 · 4 comments

Comments

@amztc34283

How can I use this with local model?

@eyurtsev
Collaborator

eyurtsev commented Apr 4, 2024

You can't without making some changes; we're using https://python.langchain.com/docs/modules/model_io/chat/structured_output. It should be possible as we add support for local models.

Until then your best bet is a parsing approach, so you'd need to rewrite some of the code in the service to use one.
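For anyone who wants a starting point, here's a minimal sketch of the parsing approach using LangChain's PydanticOutputParser with a local model served through Ollama. The `mistral` model name and the `Person` schema are placeholders, not part of this project; this isn't wired into the service code.

```python
from langchain_community.chat_models import ChatOllama
from langchain_core.output_parsers import PydanticOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.pydantic_v1 import BaseModel, Field


class Person(BaseModel):
    """Example schema; replace with your own extraction schema."""
    name: str = Field(description="The person's full name")
    age: int = Field(description="The person's age in years")


# The parser supplies format instructions for the prompt and validates the model's JSON output.
parser = PydanticOutputParser(pydantic_object=Person)

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "Extract the requested fields from the text.\n{format_instructions}"),
        ("human", "{text}"),
    ]
).partial(format_instructions=parser.get_format_instructions())

# Any local chat model works here; Mistral via Ollama is just an example.
llm = ChatOllama(model="mistral", temperature=0)

chain = prompt | llm | parser
result = chain.invoke({"text": "Alice is 31 and lives in Paris."})
print(result)  # e.g. Person(name='Alice', age=31)
```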

@amztc34283
Author

Thanks, I asked a question about this function; I could probably copy it from the partner folder.

@amztc34283
Author

I could also create a PR if this is something you want.

@gaj995

gaj995 commented Apr 17, 2024

@amztc34283 Were you able to set it up with a local model? I want to test a Mistral model through Ollama; any idea on how to implement this?
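One way to try Mistral through Ollama while structured-output support is pending is Ollama's JSON mode combined with a JSON output parser. A rough sketch, assuming that combination stands in for `with_structured_output` (the prompt and keys are illustrative):

```python
from langchain_community.chat_models import ChatOllama
from langchain_core.output_parsers import JsonOutputParser
from langchain_core.prompts import ChatPromptTemplate

# format="json" asks Ollama to constrain the model to valid JSON output.
llm = ChatOllama(model="mistral", format="json", temperature=0)

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "Return a JSON object with keys 'name' and 'age' extracted from the text."),
        ("human", "{text}"),
    ]
)

chain = prompt | llm | JsonOutputParser()
print(chain.invoke({"text": "Bob turned 42 last week."}))  # e.g. {'name': 'Bob', 'age': 42}
```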
