
[Bug]: speaker_selection_agent does not get registered in GroupChat if the selector is a custom model client #2643

Open
Basekill opened this issue May 10, 2024 · 4 comments · May be fixed by #2696
Labels
bug (Something isn't working) · group chat (group-chat-related issues) · help wanted (Extra attention is needed)

Comments

Basekill commented May 10, 2024

Describe the bug

When creating a group chat with speaker_selection_method="auto" and a group chat manager whose llm_config uses a custom model client, the internally created speaker_selection_agent inherits the custom llm_config but is never registered with the custom model client class.
As a result, a "model client is not activated" error is raised when the speaker_selection_agent tries to call the client at

response = llm_client.create(...)

Here the speaker_selection_agent is not registered:

speaker_selection_agent = ConversableAgent(...)

The same problem occurs anywhere the library creates an agent on your behalf.
For example, in Teachability, self.analyzer is created internally without the custom model client being registered.

Steps to reproduce

  1. Create a GroupChat with speaker_selection_method="auto"
  2. Create a GroupChatManager with a custom model client as llm_config
  3. Register the GroupChatManager with the custom model client
  4. Initiate a chat with the GroupChatManager
  5. See the "model client is not activated" error (a minimal repro sketch follows below)
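
Roughly, the failing setup looks like the sketch below. This is a hedged, minimal reconstruction rather than my exact code: the CustomModelClient stub is only a stand-in for autogen's ModelClient protocol, the agent names are made up, and the exact error wording may differ between versions.

```python
import autogen
from types import SimpleNamespace


class CustomModelClient:
    """Minimal stand-in for a custom model client following autogen's ModelClient protocol."""

    def __init__(self, config, **kwargs):
        self.model = config["model"]

    def create(self, params):
        # Return a dummy response shaped like what autogen expects: a `choices` list
        # whose entries carry a `message` with `content` / `function_call`.
        message = SimpleNamespace(content="dummy reply", function_call=None)
        return SimpleNamespace(choices=[SimpleNamespace(message=message)], model=self.model)

    def message_retrieval(self, response):
        return [choice.message.content for choice in response.choices]

    def cost(self, response):
        return 0

    @staticmethod
    def get_usage(response):
        # Usage accounting stub; a real client would report actual token counts and cost.
        return {"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0,
                "cost": 0, "model": response.model}


config_list = [{"model": "my-custom-model", "model_client_cls": "CustomModelClient"}]
llm_config = {"config_list": config_list}

alice = autogen.ConversableAgent("alice", llm_config=llm_config, human_input_mode="NEVER")
bob = autogen.ConversableAgent("bob", llm_config=llm_config, human_input_mode="NEVER")
alice.register_model_client(model_client_cls=CustomModelClient)
bob.register_model_client(model_client_cls=CustomModelClient)

# 1. GroupChat with speaker_selection_method="auto"
groupchat = autogen.GroupChat(
    agents=[alice, bob], messages=[], max_round=4, speaker_selection_method="auto"
)

# 2./3. GroupChatManager with the custom llm_config, registered with the custom client
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)
manager.register_model_client(model_client_cls=CustomModelClient)

# 4./5. Initiating the chat fails during auto speaker selection: the internal
# speaker_selection_agent inherits llm_config, but register_model_client is never
# called on it, so the "model client is not activated" error is raised.
alice.initiate_chat(manager, message="hello")
```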

Model Used

Custom model

Expected Behavior

The internally created speaker_selection_agent should be registered with the same custom model client as the selector (the GroupChatManager).

Screenshots and logs

[Screenshot: the custom LLM client does not get registered here]

Additional Information

No response

Basekill added the bug label May 10, 2024
ekzhu added the group chat label May 10, 2024
ekzhu (Collaborator) commented May 10, 2024

Thanks for pointing this out. What do you think about exposing the select_speaker_agent, either through the constructor or as an attribute, so we can better customize its llm_config and client?

cc @WaelKarkoub @marklysze

Basekill (Author) commented
Although exposing select_speaker_agent would solve the problem in this case, the underlying issue is more general: it affects any agent that is initialised for you inside the library (internal helper agents created by the library rather than by the developer).

For example, the self.analyzer created by Teachability (self.analyzer = TextAnalyzerAgent(llm_config=self.llm_config)) is never registered with a custom model client.

I'm sure there are other places where an agent is used internally but is not registered.

One solution could be to store a list of registered custom model client classes, then when an internal agent is used, it would register all the model client classes in that list.
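
A rough sketch of the idea (purely illustrative; these attribute and method names don't exist in autogen today and are only meant to show the shape of the change):

```python
class ConversableAgentSketch:
    """Hypothetical variant of ConversableAgent that remembers its custom client classes."""

    def __init__(self, name, llm_config=None):
        self.name = name
        self.llm_config = llm_config
        self._registered_client_classes = []  # (cls, kwargs) pairs remembered here

    def register_model_client(self, model_client_cls, **kwargs):
        # ... the existing registration on self.client would happen here ...
        self._registered_client_classes.append((model_client_cls, kwargs))

    def _create_internal_agent(self, name):
        # Any helper agent the library creates inherits the parent's llm_config
        # *and* re-registers every custom client class the parent registered.
        helper = ConversableAgentSketch(name, llm_config=self.llm_config)
        for cls, kwargs in self._registered_client_classes:
            helper.register_model_client(cls, **kwargs)
        return helper
```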

If we were to expose the internal agents instead, it would have to be done for every internal agent, which risks revealing implementation details that could change in the future. On the other hand, it would add customizability, since an internal agent could then use a different client from its outer agent, so it is a trade-off.

ekzhu (Collaborator) commented May 13, 2024

One solution could be to store a list of registered custom model client classes, then when an internal agent is used, it would register all the model client classes in that list.

This is a good idea. We can start by simply passing the client of the parent agent to the internal agent, starting from the group chat.
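
Something along these lines, perhaps (just a sketch of the idea, not actual autogen code; the helper function is made up, though ConversableAgent does keep its client wrapper in a `client` attribute):

```python
from autogen import ConversableAgent


def make_speaker_selection_agent(selector: ConversableAgent, select_speaker_prompt: str):
    # Build the internal agent from the selector's llm_config, as is done today...
    agent = ConversableAgent(
        "speaker_selection_agent",
        system_message=select_speaker_prompt,
        llm_config=selector.llm_config,
        human_input_mode="NEVER",
    )
    # ...but reuse the selector's already-activated client wrapper, so any custom
    # model client registered on the GroupChatManager is available here too.
    agent.client = selector.client
    return agent
```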

ekzhu added the help wanted label May 13, 2024
Matteo-Frattaroli (Collaborator) commented May 15, 2024

Same problem here.

I agree with @Basekill's statement:

the underlying issue is more general: it affects any agent that is initialised for you inside the library (internal helper agents created by the library rather than by the developer).

@ekzhu maybe some kind of custom-model detection strategy could be defined, for example checking for the presence of 'model_client_cls' in the llm_config (though I'm not sure whether that key is also used by the OpenAI/Azure OpenAI clients). That way a custom model that needs registration could be detected, and register_model_client could be called on any agent instance created within the library.

If 'model_client_cls' is already used by the OpenAI/Azure OpenAI clients, perhaps a new config key that distinguishes custom models could be required instead.
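
For example, something along these lines (a hedged sketch; `custom_client_registry`, a name-to-class lookup, is hypothetical and doesn't exist in autogen):

```python
def register_inherited_custom_clients(agent, llm_config, custom_client_registry):
    """Register on `agent` every custom client class named in its inherited llm_config."""
    for entry in (llm_config or {}).get("config_list", []):
        cls_name = entry.get("model_client_cls")
        if cls_name is None:
            continue  # plain OpenAI / Azure OpenAI entry, nothing to register
        agent.register_model_client(model_client_cls=custom_client_registry[cls_name])
```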

Matteo-Frattaroli pushed a commit to Matteo-Frattaroli/autogen that referenced this issue May 15, 2024
@Matteo-Frattaroli Matteo-Frattaroli linked a pull request May 15, 2024 that will close this issue