Issues: ollama/ollama

Please update go module github.com/chewxy/math32 to the latest [bug]
#4297 opened May 9, 2024 by HougeLangley
longtext llama3-gradient bug [bug]
#4293 opened May 9, 2024 by bambooqj
bug: extra zero being added to Context Length and Max Tokens [bug]
#4288 opened May 9, 2024 by edwardochoaphd
Ollama crate cannot generate modelfile with non-English name [bug]
#4285 opened May 9, 2024 by BarcodeQH
Ollama v0.1.34 Timeout issue on Codellama34B [bug]
#4283 opened May 9, 2024 by humza-sami
Can ollama support Huawei Ascend NPU? [feature request]
#4282 opened May 9, 2024 by lonngxiang
Get Entropy [feature request]
#4281 opened May 9, 2024 by antonbugaets
Error: pull model manifest: file does not exist [bug]
#4280 opened May 9, 2024 by taozhiyuai
Ollama reports an error when running the AI model using GPU [bug]
#4279 opened May 9, 2024 by xiaomo0925
bge-m3 [model request]
#4276 opened May 9, 2024 by Mimicvat
Update command for Linux version [feature request]
#4274 opened May 9, 2024 by Maplerxyz
API usage [bug]
#4273 opened May 9, 2024 by w1757876747
Partial pruning does not work [bug]
#4271 opened May 9, 2024 by jmorganca
Windows ollama 0.1.34 cannot use GPU with NVIDIA RTX 4060 [bug]
#4270 opened May 9, 2024 by zhafree
403 using zrok [documentation]
#4262 opened May 8, 2024 by quantumalchemy
Error: could not connect to ollama app, is it running? [amd, bug, windows]
#4260 opened May 8, 2024 by starMagic
Stop loading model while I close my computer [bug]
#4259 opened May 8, 2024 by chaserstrong
Support for InternVL-Chat-V1.5 [model request]
#4257 opened May 8, 2024 by wwjCMP
How does the ollama model reside on the GPU? [feature request, needs more info]
#4254 opened May 8, 2024 by lonngxiang