
[Feature request] Add MLC Web-LLM #4394

Closed
pacoccino opened this issue Feb 13, 2024 · 9 comments · Fixed by #5243
Labels
auto:enhancement: A large net-new component, integration, or chain. Use sparingly. The largest features
help wanted: This would make a good PR

Comments

@pacoccino

It would be great to add support for MLC-AI Web-LLM.

It lets you execute LLMs in the browser via WASM/WebGPU.

https://github.com/mlc-ai/web-llm


dosubot bot commented Feb 13, 2024

🤖

Hey @pacoccino! Great to see you back with another interesting idea. How's everything going?

Thank you for your feature request. Adding support for MLC-AI's Web-LLM to the LangChainJS framework could indeed enhance the functionality of applications developed with LangChain. However, integrating this feature would require significant changes to the current architecture of the LangChainJS codebase.

Currently, the execution of language models in the LangChainJS framework is handled by the BaseLanguageModel class, which provides several methods for executing language models, including generatePrompt, predict, and predictMessages. These methods are abstract and their actual implementation is in the subclasses of BaseLanguageModel.

To add support for Web-LLM, we would need to create a new subclass of BaseLanguageModel that implements these methods using the Web-LLM API. This would involve writing code to load the Web-LLM WASM module, initialize it with the appropriate language model, and then use it to execute the language model on the input text.

In addition, we would need to modify the Chains module to use this new subclass when the user wants to execute a language model in the browser. This would likely involve adding a new option to the Chains module's configuration that specifies whether to use the Web-LLM or the current method for executing language models.

Finally, we would need to thoroughly test this new feature to ensure that it works correctly and does not introduce any new bugs into the LangChainJS framework.
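The plan the bot outlines can be sketched as a minimal adapter. Everything below is hypothetical for illustration: the class and method names (`WebLLMChatModel`, `initialize`, `call`) are assumptions, and the engine surface is stubbed as an interface rather than imported, since the real `@mlc-ai/web-llm` engine only runs in a browser with WebGPU.

```typescript
// Hedged sketch of the integration described above. The engine surface is
// stubbed as an interface (the real one would come from "@mlc-ai/web-llm"),
// and all names here are assumptions, not the actual LangChainJS API.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Minimal stand-in for the web-llm engine API (an assumption, not the real surface).
interface WebLLMEngineLike {
  reload(modelId: string): Promise<void>;
  chatCompletion(messages: ChatMessage[]): Promise<string>;
}

// The subclass the bot describes: load the WASM module and model once,
// then route prompt execution through the engine.
class WebLLMChatModel {
  private initialized = false;

  constructor(
    private engine: WebLLMEngineLike,
    private modelId: string,
  ) {}

  // Load the model into the engine. In the real library this compiles the
  // WASM module and fetches weights, so it is async and slow the first time.
  async initialize(): Promise<void> {
    await this.engine.reload(this.modelId);
    this.initialized = true;
  }

  // The analogue of predict/generatePrompt: run the messages through the engine.
  async call(messages: ChatMessage[]): Promise<string> {
    if (!this.initialized) {
      await this.initialize();
    }
    return this.engine.chatCompletion(messages);
  }
}
```

In this shape, a chain could accept a `WebLLMChatModel` instance wherever it currently accepts a language model, which is roughly the configuration change the bot suggests for the Chains module.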

Unfortunately, as a technical support representative, I am unable to implement this feature myself. However, I would recommend creating a new issue on the LangChainJS GitHub repository detailing your feature request. This will allow the LangChainJS developers to evaluate your request and potentially implement it in a future update.

Please note that while I strive to provide accurate and up-to-date information, there may be a better solution that I was not able to find. If you have any further questions or need more information, please don't hesitate to ask.

Here is the link to the LangChainJS GitHub repository where you can create a new issue: LangChainJS GitHub

I hope this information is helpful. Thank you for your interest in the LangChainJS framework.

Best regards,
Dosu


About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

@dosubot dosubot bot added the auto:enhancement A large net-new component, integration, or chain. Use sparingly. The largest features label Feb 13, 2024
@jacoblee93
Collaborator

Yes this would be incredible!

@jacoblee93 jacoblee93 added the help wanted This would make a good PR label Feb 14, 2024
@pacoccino
Author

pacoccino commented Feb 14, 2024

I don't know the LangChain codebase well. Are the bot's suggestions correct?

Would this "require significant changes to the current architecture" as the bot says, or could it be implemented easily, like HF's transformers.js was?

I can give it a try if there are no big architecture changes involved.

@Adrastopoulos
Contributor

@pacoccino Have you made significant effort towards this? I'm willing to pick this up.

@jacoblee93
Collaborator

I ended up adapting his PR on another repo here:

https://github.com/jacoblee93/fully-local-pdf-chatbot/blob/main/app/lib/chat_models/webllm.ts

Still a few rough edges but if you're down to refine it we'd welcome the addition!

@Adrastopoulos
Contributor

Awesome. I'll take it on.

@stvnshpd

Hey! I tried running the latest implementation that @jacoblee93 put together, but at compile time I get an import error for 'Engine' from the "@mlc-ai/web-llm" library, coming from the webllm.ts file in the langchain-community package. That file calls 'webllm.Engine()' during initialization, yet the "@mlc-ai/web-llm" library exports an 'MLCEngine' class rather than 'Engine'. Is this a known discrepancy, and is there a resolution?

@jacoblee93
Collaborator

I think there was a breaking change in a recent WebLLM version a little while ago - what version did you install?
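For readers hitting the same error: the breaking change referred to appears to be a class rename in `@mlc-ai/web-llm`. The sketch below is a hedged note, not an authoritative changelog; the exact version where the rename landed is an assumption, so check the web-llm release notes.

```typescript
// Older @mlc-ai/web-llm releases exported the engine class as `Engine`:
//   import * as webllm from "@mlc-ai/web-llm";
//   const engine = new webllm.Engine();     // fails on newer versions
//
// Newer releases renamed it to `MLCEngine`:
//   const engine = new webllm.MLCEngine();  // current class name
//
// So the langchain-community webllm.ts and the installed web-llm version
// must land on the same side of that rename.
```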

@stvnshpd

For the "@mlc-ai/web-llm" library, I swept versions from 0.2.38 down through 0.2.30, plus several older ones, including some 0.1.x releases, but the error never changed. (I ran the sweep against both "@langchain/community" 0.2.0 and 0.2.1.)
