Plugin that creates a ChromaDB vector database to work with LM Studio running in server mode!
Updated May 5, 2024 - Python
visionOS examples ⸺ Spatial Computing Accelerators for Apple Vision Pro
LLMX: an easy Ollama UI for the web! The first third-party UI with LM Studio support!
This repository hosts a web-based chat application that serves AI models such as Mistral, OpenAI, and Llama through LM Studio via a Gradio interface. It maintains conversation history for a continuous, coherent chat experience akin to ChatGPT or Claude.
Serverless, single-HTML-page access to an OpenAI-API-compatible local LLM
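Several of the projects above talk to LM Studio's server mode through its OpenAI-compatible REST API. A minimal sketch of what such a request looks like, assuming LM Studio's default local endpoint (`http://localhost:1234/v1`); the model name and temperature here are placeholder assumptions, not values from any of the listed projects:

```python
import json
import urllib.request

# LM Studio's server mode exposes an OpenAI-compatible REST API.
# This is the default local endpoint; adjust it if you changed the
# port in LM Studio's server settings.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": model,  # placeholder; LM Studio serves whichever model is loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello!")
# Actually sending the request requires a running LM Studio server:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the wire format matches OpenAI's Chat Completions API, the same request shape works with most OpenAI client libraries by pointing their base URL at the local server.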
Automate the batching and execution of prompts.
Solve complex problems by intelligently orchestrating subagents using a local LLM, embeddings, and DuckDuckGo search.