
Support local LLMs #35

Open
miurla opened this issue May 16, 2023 · 4 comments
Labels
enhancement New feature or request

Comments

@miurla
Owner

miurla commented May 16, 2023

Add local LLMs as a model option.

React framework: https://github.com/r2d4/react-llm
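
For context, a rough sketch of how react-llm's headless hook might be wired into a component. The package name, the `useLLM` hook, and the fields shown here (`init`, `send`, `conversation`, `isGenerating`) are assumptions based on the project's description of itself as "headless React hooks for LLMs in the browser"; the actual API should be checked against the react-llm README.

```tsx
// Hypothetical sketch: the package, hook name, and returned fields
// (init, send, conversation, isGenerating) are assumptions; verify
// them against react-llm's README before relying on this.
import useLLM from "@react-llm/headless";

export function LocalModelChat() {
  const { conversation, isGenerating, init, send } = useLLM();

  return (
    <div>
      {/* Download/compile the model in the browser (WebGPU) before chatting */}
      <button onClick={() => init()}>Load local model</button>

      {/* Dump the conversation state for inspection */}
      <pre>{JSON.stringify(conversation, null, 2)}</pre>

      <button disabled={isGenerating} onClick={() => send("Hello!")}>
        Send test prompt
      </button>
    </div>
  );
}
```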

@miurla miurla added the enhancement New feature or request label May 16, 2023
@sprinteroz

Hi, very interesting. Just wondering: are you planning to build a UI for the hooks so it can be integrated into BabyAGI-UI?

@miurla
Owner Author

miurla commented Jul 23, 2023

I plan to start working on open-source model support, starting with the API. #139

What kind of integration do you want? I'd like to know more details.

@SEVENID

SEVENID commented Aug 2, 2023

How about API support for https://github.com/oobabooga/text-generation-webui?

@miurla
Owner Author

miurla commented Aug 3, 2023

@SEVENID
text-generation-webui seems easy to connect via API, which is nice. 👍
However, requiring the user to install it themselves is a bit of a drawback.
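
For reference, a minimal sketch of what a call to a locally running text-generation-webui instance could look like from a TypeScript backend. The host, port, endpoint path, and payload fields are assumptions based on the project's legacy blocking API as of mid-2023 (server started with the API enabled, listening on localhost:5000) and may differ in newer versions; check the current docs.

```ts
// Minimal sketch of calling a locally running text-generation-webui instance.
// Assumptions: API enabled, listening on localhost:5000, legacy blocking
// API shape (POST /api/v1/generate returning { results: [{ text }] }).
async function generateLocal(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:5000/api/v1/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt,
      max_new_tokens: 200,
      temperature: 0.7,
    }),
  });

  if (!res.ok) {
    throw new Error(`text-generation-webui request failed: ${res.status}`);
  }

  const data = (await res.json()) as { results: { text: string }[] };
  return data.results[0]?.text ?? "";
}

// Example usage:
// const reply = await generateLocal("Write a short task list for today.");
```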
