Enhancement: Introduce the feature to invoke ollama through API on a remote machine #2583
derhelge started this conversation in Feature Requests & Suggestions
Replies: 1 comment 1 reply
-
This can be done in a myriad of ways and is more of a networking challenge than something to be implemented natively in LibreChat. You may be better off asking or searching in the LiteLLM project, as they handle proxying of services. I've been able to do this myself in a small use case through SSH tunneling, but you would probably want something more robust.
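For reference, the SSH-tunneling approach mentioned above can be sketched as a single local port forward (here `user@remote-host` is a placeholder, and 11434 is Ollama's default port):

```shell
# Forward local port 11434 to the Ollama instance on the remote machine.
# -N: do not run a remote command; -L: local port forward.
# "user@remote-host" is a placeholder for your actual SSH target.
ssh -N -L 11434:localhost:11434 user@remote-host

# LibreChat (or any client) can then use http://localhost:11434
# as if Ollama were running locally.
```

This is the "small use case" variant: it requires no changes to LibreChat, but the tunnel must be kept alive and provides no token-based authentication of its own.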
-
What features would you like to see added?
I would like to request the ability to run Ollama on a remote server via its API. This comes from the need to leverage the greater computational power available to individual models. An nginx reverse proxy on the server handles Bearer Token authentication, so for this to work, LibreChat would need to be adapted to send an authentication header to the Ollama endpoint. Your assistance with this would be appreciated.
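As an illustration of the server side described above, a minimal nginx reverse-proxy fragment could look like the following. This is a sketch under assumptions: the hostname, token, and certificate paths are placeholders, and nginx compares the raw `Authorization` header value, since it cannot validate tokens natively without an `auth_request` backend.

```nginx
server {
    listen 443 ssl;
    server_name ollama.example.com;              # placeholder hostname
    ssl_certificate     /etc/ssl/ollama.crt;     # placeholder cert paths
    ssl_certificate_key /etc/ssl/ollama.key;

    location / {
        # Reject any request that does not carry the expected Bearer token.
        # "my-secret-token" is a placeholder; for anything serious, prefer
        # an auth_request subrequest over a literal string comparison.
        if ($http_authorization != "Bearer my-secret-token") {
            return 401;
        }
        proxy_pass http://127.0.0.1:11434;       # local Ollama instance
        proxy_set_header Host $host;
    }
}
```

With this in place, only clients that send `Authorization: Bearer my-secret-token` reach the Ollama API, which is exactly the header LibreChat would need to be able to configure.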
More details
I think that ModelService and the documentation would need to be updated.
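To make the requested change concrete: on the client side it amounts to attaching a Bearer token to requests sent to the Ollama endpoint. The sketch below is an illustration only, in plain Python rather than LibreChat's Node codebase, and the URL, model name, and token are all placeholders:

```python
import json
import urllib.request


def build_ollama_request(base_url: str, token: str, model: str,
                         prompt: str) -> urllib.request.Request:
    """Build a POST to Ollama's /api/generate behind a Bearer-token proxy.

    base_url and token are placeholders for the user's proxy URL and secret.
    """
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    return urllib.request.Request(
        url=f"{base_url}/api/generate",
        data=payload,
        headers={
            "Content-Type": "application/json",
            # This header is the adaptation the feature request asks for:
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )


# Placeholder values; no network call is made until urlopen(req) is invoked.
req = build_ollama_request("https://ollama.example.com",
                           "my-secret-token", "llama3", "Hello")
print(req.get_header("Authorization"))
```

The same idea applied inside LibreChat would mean letting the Ollama endpoint configuration carry an API key that is forwarded as an `Authorization` header, analogous to how OpenAI-compatible endpoints are configured.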
Which components are impacted by your request?
No response
Pictures
No response