[FEATURE] add (SOCKS or HTTP) proxy support to Mistral AI client, like OpenAI and others #985

Open
ngrue opened this issue Apr 19, 2024 · 0 comments
Labels: enhancement (New feature or request), P2 (High priority)

Comments

ngrue commented Apr 19, 2024

Context

As of version 0.29.1, it is not possible to specify a proxy when building a client for Mistral AI (either the regular or the streaming one). That makes the client almost unusable behind a corporate proxy: the only workaround would be to configure system-wide proxy environment variables, which affects every other HTTP client on the same machine, including those that should not go through that proxy.

Expected enhancement

Something that would look like what can be done in langchain4j-examples with the OpenAI client (and some others):
https://github.com/langchain4j/langchain4j-examples/blob/main/other-examples/src/main/java/ProxyExample.java
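
For reference, a rough sketch of what that linked example does with the OpenAI client, assuming the same builder-style proxy(...) method; the proxy host, port and model name placeholders below are illustrative, not taken from the example:

ChatLanguageModel model = OpenAiChatModel.builder()
    .apiKey(API_KEY)
    .modelName(MODEL_NAME)
    // route all requests of this client through the given HTTP proxy
    .proxy(new Proxy(java.net.Proxy.Type.HTTP, new InetSocketAddress(PROXY_URL, PROXY_PORT)))
    .build();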

The idea would be to add that missing "proxy" method (here is an example of what it might look like with a streaming client):

StreamingChatLanguageModel model = MistralAiStreamingChatModel.builder()
    .proxy(new Proxy(java.net.Proxy.Type.HTTP, new InetSocketAddress(PROXY_URL, PROXY_PORT)))
    .apiKey(API_KEY)
    .baseUrl(BASE_URL)
    .modelName(MODEL_NAME)
    .build();

Alternatives and workarounds while the feature is missing

Since there is no way to retrieve (and then configure) the underlying HTTP client before the actual HTTP call is made, I did not find a way to get around that limitation.

I can only see two workarounds, neither of which is very promising:

  • drop langchain4j and implement the HTTP calls by hand with basic InputStream / OutputStream handling
  • system-wide (JVM) proxy settings, which would put every other HTTP client hosted on the same JVM at risk (this would require a big overhaul of legacy code, and therefore a lot of regression testing); of course, if you have a brand-new platform to play with, that is not an issue. A sketch of this second workaround follows the list.
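
A minimal sketch of that second workaround, assuming the standard java.net proxy system properties (host, port and bypass patterns below are placeholders); http.nonProxyHosts can limit the blast radius somewhat, but every HTTP client in the JVM that honours these properties is still affected:

// JVM-wide proxy configuration via standard java.net system properties
System.setProperty("http.proxyHost", "proxy.example.com");
System.setProperty("http.proxyPort", "3128");
System.setProperty("https.proxyHost", "proxy.example.com");
System.setProperty("https.proxyPort", "3128");
// hosts that must bypass the proxy (pipe-separated patterns, applies to both http and https)
System.setProperty("http.nonProxyHosts", "localhost|127.0.0.1|*.internal.example.com");

The same properties can also be passed on the command line as -D flags instead of being set programmatically.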

I could not find any other resource on the matter.
If someone has found another way, I would love to have some feedback 😁.

ngrue added the enhancement (New feature or request) label on Apr 19, 2024
langchain4j added the P2 (High priority) label on Apr 22, 2024