
feat: add support Ollama backend & bump golang to 1.22 #1065

Open · wants to merge 1 commit into base: main
Conversation

@yankay (Contributor) commented Apr 14, 2024

Closes #1064

📑 Description

Ollama can make it easier for users to interact with K8sGPT.
This PR adds support for an Ollama backend via the Ollama API.
Because the Ollama API requires Go v1.22, Go is bumped to v1.22.

Usage:

# ./bin/k8sgpt auth add -b ollama  -m llama3 -u http://localhost:11434
ollama added to the AI backend provider list
# ./bin/k8sgpt analyze --explain -b ollama 
AI Provider: ollama

0: Service haha/dao-2048()
- Error: Service has not ready endpoints, pods: [Pod/dao-2048-69696bf664-kd997], expected 1
Error: Service has not ready endpoints, pods: [Pod/dao-2048-69696bf664-kd997], expected 1.

Solution:
1. Check the pod's status using `kubectl get pod <pod_name> -o yaml`.
2. Verify if the container is running and its logs are showing any errors.
3. If the container is not running, try restarting it with `kubectl exec <pod_name> -- restart`.
4. If the issue persists, check the service's configuration to ensure it's correctly pointing to the pod's port.

Next TODOs:

✅ Checks

  • My pull request adheres to the code style of this project
  • My code requires changes to the documentation
  • I have updated the documentation as required
  • All the tests have passed

ℹ Additional Information

@yankay yankay requested review from a team as code owners April 14, 2024 06:19
@yankay yankay changed the title feat: add Ollama backend feat: add support Ollama backend Apr 14, 2024
@arbreezy (Member) commented

@yankay this looks similar to localai's backend, which utilizes openai's API 🤔

@yankay (Contributor, Author) commented Apr 16, 2024

> @yankay this looks similar to localai's backend, which utilizes openai's API 🤔

Hi @arbreezy

Like OpenAI and Azure OpenAI, they are similar but distinct projects: https://hyscaler.com/insights/ollama-vs-localai-open-source-local-llm-apis/

So they need to be implemented as two providers.

ref:
LocalAI: https://localai.io/
Ollama: https://github.com/ollama/ollama

What do you think? :-)

@arbreezy (Member) commented

> Like OpenAI and Azure OpenAI, they are similar but distinct projects, so they need to be implemented as two providers.

Azure OpenAI is slightly different, but I get your argument.

I don't have a strong opinion on adding another file for Ollama identical to localai's; ideally we would have a generic 'local' backend which supports the OpenAI APIs.

Any thoughts on that, @AlexsJones @matthisholleville?

@yankay (Contributor, Author) commented Apr 22, 2024

> I don't have a strong opinion on adding another file for Ollama identical to localai's; ideally we would have a generic 'local' backend which supports the OpenAI APIs.

Thanks @arbreezy

Ollama has an official Go client: https://github.com/ollama/ollama/blob/main/api/client.go
If the maintainers agree, I can change the code to use it :-)

@arbreezy (Member) commented

> If the maintainers agree, I can change the code to use it :-)

@yankay I think this makes more sense. Any thoughts on that, @AlexsJones @matthisholleville?

@AlexsJones (Member) commented

> @yankay I think this makes more sense.

I agree, thanks

@yankay yankay force-pushed the ollama branch 2 times, most recently from 9a64da0 to 1d4ee29 Compare May 6, 2024 10:51
@yankay yankay changed the title feat: add support Ollama backend feat: add support Ollama backend & bump golang to 1.22 May 6, 2024
@yankay (Contributor, Author) commented May 6, 2024

> I agree, thanks

Thanks @AlexsJones @arbreezy

The code has been changed to use the official Ollama Go client: https://github.com/ollama/ollama/blob/main/api/client.go

Would you please help review it? :-)

Review comment on the diff:

```go
req := &ollama.GenerateRequest{
	Model:  c.model,
	Prompt: prompt,
	Stream: new(bool),
}
```

@AlexsJones (Member) commented:

where is that bool defined?

@yankay (Contributor, Author) commented May 15, 2024

Hi @AlexsJones

The bool is defined at https://github.com/ollama/ollama/blob/main/api/types.go#L62 .
Because streaming defaults to true, Stream needs to be set to false so that resp.Response can be returned as a single string.

What do you think? :-)
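For readers unfamiliar with the `new(bool)` idiom discussed above: `new(T)` allocates a zeroed `T` and returns a pointer to it, so `new(bool)` is the shortest way to get a `*bool` pointing at `false`. A minimal, self-contained sketch follows; note that `generateRequest` below only mirrors the shape of ollama's `api.GenerateRequest` for illustration and is not the real type:

```go
package main

import "fmt"

// generateRequest is a local stand-in that mirrors the shape of
// ollama's api.GenerateRequest; it is NOT the real type. On the real
// type, a nil Stream means "use the default" (streaming on), while a
// pointer to false disables streaming.
type generateRequest struct {
	Model  string
	Prompt string
	Stream *bool
}

// newGenerateRequest builds a non-streaming request: new(bool)
// allocates a bool (zero value: false) and returns a pointer to it.
func newGenerateRequest(model, prompt string) *generateRequest {
	return &generateRequest{
		Model:  model,
		Prompt: prompt,
		Stream: new(bool),
	}
}

func main() {
	req := newGenerateRequest("llama3", "why is the sky blue?")
	fmt.Println(*req.Stream) // prints false: the zero value new(bool) points to
}
```

With streaming disabled this way, the client's response callback fires once with the complete output instead of once per token, which is why `resp.Response` can be treated as a single string.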

@yankay yankay force-pushed the ollama branch 2 times, most recently from 7070887 to 1931f6b Compare May 17, 2024 08:38
Labels: none yet
Projects: Status: Proposed
Successfully merging this pull request may close: [Feature]: support Ollama backend
3 participants