
Port autogpt.core.resource.model_provider from AutoGPT to Forge #7001

Closed
Pwuts opened this issue Mar 11, 2024 · 3 comments · Fixed by #7106 · May be fixed by #7117
Labels
architecture (Topics related to package and system architecture) · Forge · Roadmapped (Issues that were spawned by roadmap items)

Comments


Pwuts commented Mar 11, 2024

Proposed new module name: forge.llm

Dependencies

TODO

  1. Port autogpt.core.resource.model_provider
  2. Make single interface for client initialization/usage
  3. Check module configuration setup (see below)
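TODO item 2 asks for a single interface for client initialization and usage. One way to sketch this (purely illustrative; `register_provider`, `get_provider`, and the registry are hypothetical names, not Forge's actual API) is a small registry that maps a provider name to a factory, so callers never construct vendor clients directly:

```python
from typing import Callable, Dict

# Hypothetical registry: provider name -> factory that builds a configured client.
_PROVIDER_REGISTRY: Dict[str, Callable[[], object]] = {}


def register_provider(name: str):
    """Decorator that registers a provider factory under a name."""
    def decorator(factory: Callable[[], object]) -> Callable[[], object]:
        _PROVIDER_REGISTRY[name] = factory
        return factory
    return decorator


@register_provider("openai")
def _openai_factory() -> object:
    # Placeholder for constructing the real OpenAIProvider.
    return "OpenAIProvider()"


def get_provider(name: str) -> object:
    """Single entry point: look up and initialize the requested provider."""
    try:
        return _PROVIDER_REGISTRY[name]()
    except KeyError:
        raise ValueError(f"unknown provider: {name}")
```

With this shape, adding a new model provider means registering one factory, and downstream code only ever calls `get_provider(...)`.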

Notes

  • Configuration may need revision
    We want Forge components to be portable and usable as stand-alone imports. Modules should be able to configure themselves if no configuration is passed in.
    Example: OpenAI's constructor has an api_key parameter. If not set, it will try to read the API key from the OPENAI_API_KEY environment variable.

    Our OpenAIProvider wraps an OpenAI or AzureOpenAI client, depending on the configuration. We think it makes sense to preserve this behavior.
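The self-configuration behavior described above can be sketched as follows. This is a minimal illustration, not Forge's actual code: `OpenAICredentials`, `from_env`, and `make_provider` are hypothetical names, and the returned strings stand in for constructing real `OpenAI` / `AzureOpenAI` clients. The environment-variable fallback mirrors the pattern the note describes.

```python
import os
from dataclasses import dataclass
from typing import Optional


@dataclass
class OpenAICredentials:
    """Hypothetical credentials object for illustration only."""
    api_key: str
    azure_endpoint: Optional[str] = None  # set when targeting an Azure deployment

    @classmethod
    def from_env(cls) -> "OpenAICredentials":
        # Mirror the fallback described above: when no explicit config is
        # passed in, read the API key from the OPENAI_API_KEY env variable.
        api_key = os.environ.get("OPENAI_API_KEY")
        if not api_key:
            raise ValueError("OPENAI_API_KEY is not set and no api_key was given")
        return cls(
            api_key=api_key,
            azure_endpoint=os.environ.get("AZURE_OPENAI_ENDPOINT"),
        )


def make_provider(credentials: Optional[OpenAICredentials] = None) -> str:
    """Self-configure from the environment when the caller passes nothing,
    so the module stays usable as a stand-alone import."""
    credentials = credentials or OpenAICredentials.from_env()
    if credentials.azure_endpoint:
        # Placeholder for wrapping an AzureOpenAI client.
        return f"AzureOpenAI client @ {credentials.azure_endpoint}"
    # Placeholder for wrapping a plain OpenAI client.
    return "OpenAI client"
```

The key design point is that explicit configuration always wins, and the environment is only consulted as a fallback, which keeps the component portable.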

Why migrate this module?

The model_provider module provides functionality and extensibility that are not available from any multi-model client we know of, e.g. LiteLLM. We would like to support as many models as possible, but:

  • As it is, AutoGPT's prompts are not portable between different model families. Until this is fixed, having access to any number of LLMs / LLM providers doesn't add much value.
  • We are eyeing some opportunities (developing LLM polyfills/middleware) for which having low-level access to the native clients is beneficial. Related: 🚀 AutoGPT Roadmap - Vendor Liberty 🗝️ #6969.

For these reasons, we want to keep our own client implementation for now.


github-actions bot commented May 1, 2024

This issue has automatically been marked as stale because it has not had any activity in the last 50 days. You can unstale it by commenting or removing the label. Otherwise, this issue will be closed in 10 days.

@github-actions github-actions bot added the Stale label May 1, 2024

This issue was closed automatically because it has been stale for 10 days with no activity.

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale May 12, 2024

ntindle commented May 12, 2024

Unstale @kcze
