Support a local llama.cpp endpoint #1601
Labels
enhancement
New update or improvement request
Comments
Immersive Translate's OpenAI option lets you customize the model and the API endpoint — would that meet your needs?
It would, but at the moment it only seems to support Ollama.
Update or Improve
Please add support for a llama.cpp endpoint. The llama.cpp backend supports Vulkan, so in theory it runs on a wider range of hardware than Ollama and with better efficiency. It could work similarly to Chatbox.
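For reference, llama.cpp's bundled `llama-server` already exposes an OpenAI-compatible REST API (by default on `http://localhost:8080`), so a client only needs to let the base URL point at it. The sketch below shows what such a request could look like; the port, model name, and prompt wording are assumptions for a typical local setup, not part of this project.

```python
import json
import urllib.request

# Assumed address of a locally running llama-server, started separately,
# e.g.: llama-server -m model.gguf --port 8080
LLAMA_SERVER_URL = "http://localhost:8080/v1/chat/completions"

def build_translation_request(text: str, target_lang: str = "English") -> dict:
    """Build an OpenAI-style chat-completion payload for a translation task."""
    return {
        # llama-server serves whatever model it was launched with;
        # the model name here is an arbitrary placeholder.
        "model": "local-model",
        "messages": [
            {"role": "system",
             "content": f"Translate the user's text into {target_lang}."},
            {"role": "user", "content": text},
        ],
        "temperature": 0.2,
    }

def translate(text: str) -> str:
    """POST the payload to the local llama-server and return the reply text."""
    req = urllib.request.Request(
        LLAMA_SERVER_URL,
        data=json.dumps(build_translation_request(text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the wire format is the same as OpenAI's, supporting llama.cpp this way would mostly mean making the base URL configurable, which is also how Chatbox handles it.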
Additional context
No response