
[FEAT] Define model for Embedding #276

Open
netandreus opened this issue Nov 14, 2023 · 1 comment
@netandreus

Problem

I am using LocalAI with Zep.

llm:
  service: "openai"
  model: "gpt-3.5-turbo-1106"
  openai_endpoint: "http://host.docker.internal:8080/v1"

I can define the model for the LLM itself, but I also need to define the model for embeddings, because it seems the embedding model is currently hardcoded to text-embedding-ada-002.

Possible solution

Add and use a model key in the embeddings options, like this:

    embeddings:
      enabled: true
      chunk_size: 200
      dimensions: 384
      service: "openai"
      model: "some-custom-model"
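A minimal sketch of how the proposed key could be consumed: look up embeddings.model from the config and fall back to the current hardcoded default when it is absent. The config shape mirrors the YAML above; the function name and key names are assumptions for illustration, not Zep's actual implementation.

```python
# Hypothetical sketch: pick the embedding model from config instead of
# hardcoding "text-embedding-ada-002". Key names follow the YAML proposed
# above; build_embedding_request is an illustrative helper, not Zep's API.

DEFAULT_EMBEDDING_MODEL = "text-embedding-ada-002"

def build_embedding_request(config: dict, texts: list) -> dict:
    """Build the JSON body for an OpenAI-compatible /v1/embeddings call."""
    embeddings_cfg = config.get("embeddings", {})
    # Fall back to today's hardcoded default when no model is configured.
    model = embeddings_cfg.get("model", DEFAULT_EMBEDDING_MODEL)
    return {"model": model, "input": texts}

config = {
    "embeddings": {
        "enabled": True,
        "chunk_size": 200,
        "dimensions": 384,
        "service": "openai",
        "model": "some-custom-model",
    }
}

payload = build_embedding_request(config, ["hello world"])
```

With the config above, the request body targets "some-custom-model"; with an empty config it degrades to the current behavior, so existing deployments would be unaffected.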
@danielchalef
Member

We're refactoring how LLMs work and separating generation/completion from embeddings, which will address the above. We'll be releasing this in the new year.
