
Better LLM and LLMClient workflow #151

Open
SubatomicPlanets opened this issue May 14, 2024 · 1 comment · May be fixed by #164
Labels
enhancement New feature or request
Comments

@SubatomicPlanets

Describe the feature

As of now, you can chat using only the LLM component. If you want multiple characters, you add LLMClient components. This makes sense, but I think it would be more intuitive if the LLM component did not inherit from LLMClient. Wouldn't it be better if the LLM component acted purely as a manager?

Basically, what I'm asking is to remove the LLMClient functionality from the LLM component. You would then always need at least one LLMClient component to chat with the LLM. This makes setting up a simple chat with one character slightly more involved, but I think it is less confusing overall.

For a simple chat app, you would then add the LLM component and one LLMClient component (which can be on the same gameObject). For more complex games with multiple characters, you would still need one LLM component, plus one LLMClient component per character.

You could also rename the LLM component to LLMManager, but that is less important, especially since it would break existing projects.

I just think the LLM component should only control settings that are shared across all LLMClients, such as which model to use. Each LLMClient would then hold only the settings that can differ per character.
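The proposed split could look something like the following. This is a minimal, hypothetical sketch in plain Python (the actual plugin is Unity C#, where these would be MonoBehaviours); the class and member names are illustrative and not the real LLM for Unity API.

```python
# Hypothetical sketch of the proposed manager/client separation.
# Names and structure are illustrative, not the actual LLM for Unity API.

class LLM:
    """Manager: holds settings shared by all clients, e.g. which model to use."""
    def __init__(self, model):
        self.model = model
        self.clients = []

    def register(self, client):
        # Each LLMClient registers itself with the single LLM manager.
        self.clients.append(client)


class LLMClient:
    """Per-character settings; always chats through an LLM manager."""
    def __init__(self, llm, character_name, prompt):
        self.character_name = character_name
        self.prompt = prompt
        llm.register(self)


# Simple chat app: one LLM plus one LLMClient
# (in Unity, both could live on the same gameObject).
llm = LLM(model="shared-model")
hero = LLMClient(llm, "Hero", "You are a brave knight.")

# Multiple characters: the same LLM manager, one client per character.
villain = LLMClient(llm, "Villain", "You are a scheming sorcerer.")
```

The point of the sketch is the division of responsibility: model selection lives once on the manager, while each client carries only its own character-specific state.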

Thanks!

@amakropoulos
Collaborator

I completely agree!
This is work in progress for a major release coming in the next few weeks :).
The LLM inheritance from LLMClient grew organically, but it is indeed best kept separate.

@amakropoulos amakropoulos added this to the v2.0.0 milestone May 14, 2024
@amakropoulos amakropoulos linked a pull request Jun 13, 2024 that will close this issue