v2.11.2: Support local Gemma, Mistral, and Llama models via Ollama

@fred-bf fred-bf released this 26 Feb 10:18
· 226 commits to main since this release
bc1794f

Important

If you want to use local Gemma, Mistral, Llama, or other models through NextChat, please refer to this document for setup: https://docs.nextchat.dev/models/ollama
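As a rough sketch of what the linked setup involves (assuming Ollama is already installed; the model name and NextChat configuration field are illustrative, consult the document above for the exact steps):

```shell
# Pull a local model with Ollama (model name is an example)
ollama pull gemma

# Start the Ollama server; it listens on http://localhost:11434 by default
ollama serve
```

NextChat can then be pointed at the local Ollama endpoint (`http://localhost:11434`) in its model provider settings, as described in the documentation.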

What's Changed

New Contributors

Full Changelog: v2.10.3...v2.11.2