Support translating files locally
#1662
Replies: 1 comment
-
Ollama is a local model runner that exposes an OpenAI-compatible API, so you can hook Ollama up to do the translation:
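Since Ollama serves the OpenAI chat-completions API on its default local port, a minimal sketch of calling it for translation might look like the following (the model name `qwen2.5` and the prompt are illustrative; this assumes `ollama serve` is running and the model has been pulled):

```shell
# Ask a locally running Ollama instance to translate a sentence.
# Nothing leaves the machine: the request goes to localhost only.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen2.5",
    "messages": [
      {"role": "system", "content": "You are a translator. Translate the user text into English."},
      {"role": "user", "content": "希望可以本地翻译文件"}
    ]
  }'
```

Any client that already speaks the OpenAI chat-completions API can be pointed at `http://localhost:11434/v1` as its base URL in the same way.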
-
New Feature
I would like the plugin to batch-translate files locally (by calling a local LLM) instead of requiring an internet connection.
Why you want to add these features
Some sensitive files cannot be translated online, so I hope local translation can be made available.
Proposal
No response