Add support for Relyt as a Vector Store #13075

Merged
merged 4 commits on May 10, 2024
1 change: 1 addition & 0 deletions docs/docs/community/integrations/vector_stores.md
@@ -35,6 +35,7 @@ as the storage backend for `VectorStoreIndex`.
- Qdrant (`QdrantVectorStore`) [Installation](https://qdrant.tech/documentation/install/) [Python Client](https://qdrant.tech/documentation/install/#python-client)
- LanceDB (`LanceDBVectorStore`) [Installation/Quickstart](https://lancedb.github.io/lancedb/basic/)
- Redis (`RedisVectorStore`). [Installation](https://redis.io/docs/latest/operate/oss_and_stack/install/install-stack/).
- Relyt (`RelytVectorStore`). [Quickstart](https://docs.relyt.cn/docs/vector-engine/).
- Supabase (`SupabaseVectorStore`). [Quickstart](https://supabase.github.io/vecs/api/).
- TiDB (`TiDBVectorStore`). [Quickstart](../../examples/vector_stores/TiDBVector.ipynb). [Installation](https://tidb.cloud/ai). [Python Client](https://github.com/pingcap/tidb-vector-python).
- TimeScale (`TimescaleVectorStore`). [Installation](https://github.com/timescale/python-vector).
273 changes: 273 additions & 0 deletions docs/docs/examples/vector_stores/RelytDemo.ipynb
@@ -0,0 +1,273 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"id": "307804a3-c02b-4a57-ac0d-172c30ddc851",
"metadata": {},
"source": [
"# Relyt\n",
"\n",
"<a href=\"https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/docs/examples/vector_stores/RelytDemo.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "36be66bf",
"metadata": {},
"source": [
"First, install the required dependencies:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a094740d",
"metadata": {},
"outputs": [],
"source": [
"%pip install llama-index-vector-stores-relyt"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6807106d",
"metadata": {},
"outputs": [],
"source": [
"%pip install llama-index \"pgvecto_rs[sdk]\""
]
},
{
"cell_type": "markdown",
"id": "6e9642d8-d3aa-49f0-b8e4-4612a716e21f",
"metadata": {},
"source": [
"Then start Relyt by following the [official documentation](https://docs.relyt.cn/docs/vector-engine/use/)."
]
},
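{
"attachments": {},
"cell_type": "markdown",
"id": "relyt-connection-env",
"metadata": {},
"source": [
"If your Relyt instance is not running locally with the default credentials, you can point this notebook at it through environment variables. The variable names below are the ones read by the connection cell later in this notebook; the values shown are placeholders, so replace them with your own connection details."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "relyt-connection-env-code",
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"# Placeholder values -- replace with your Relyt connection details.\n",
"os.environ[\"RELYT_HOST\"] = \"localhost\"\n",
"os.environ[\"RELYT_PORT\"] = \"5432\"\n",
"os.environ[\"RELYT_USER\"] = \"postgres\"\n",
"os.environ[\"RELYT_PASS\"] = \"mysecretpassword\"\n",
"os.environ[\"RELYT_NAME\"] = \"postgres\""
]
},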
{
"cell_type": "markdown",
"id": "a6fe902c-3b17-427c-b039-2d77c597c6c1",
"metadata": {},
"source": [
"Set up the logger."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d48af8e1",
"metadata": {},
"outputs": [],
"source": [
"import logging\n",
"import os\n",
"import sys\n",
"\n",
"logging.basicConfig(stream=sys.stdout, level=logging.INFO)\n",
"logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "f7010b1d-d1bb-4f08-9309-a328bb4ea396",
"metadata": {},
"source": [
"#### Creating a pgvecto_rs client"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0ce3143d-198c-4dd2-8e5a-c5cdf94f017a",
"metadata": {},
"outputs": [],
"source": [
"from pgvecto_rs.sdk import PGVectoRs\n",
"\n",
"URL = \"postgresql+psycopg://{username}:{password}@{host}:{port}/{db_name}\".format(\n",
" port=os.getenv(\"RELYT_PORT\", \"5432\"),\n",
" host=os.getenv(\"RELYT_HOST\", \"localhost\"),\n",
" username=os.getenv(\"RELYT_USER\", \"postgres\"),\n",
" password=os.getenv(\"RELYT_PASS\", \"mysecretpassword\"),\n",
" db_name=os.getenv(\"RELYT_NAME\", \"postgres\"),\n",
")\n",
"\n",
"client = PGVectoRs(\n",
" db_url=URL,\n",
" collection_name=\"example\",\n",
" dimension=1536, # Using OpenAI’s text-embedding-ada-002\n",
")"
]
},
{
"cell_type": "markdown",
"id": "c3d7ac82-0ba6-4a32-8dad-3234e42b660a",
"metadata": {},
"source": [
"#### Set up OpenAI"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4ad14111-0bbb-4c62-906d-6d6253e0cdee",
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"os.environ[\"OPENAI_API_KEY\"] = \"sk-...\""
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "8ee4473a-094f-4d0a-a825-e1213db07240",
"metadata": {},
"source": [
"#### Load documents, build the RelytVectorStore and VectorStoreIndex"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0a2bcc07",
"metadata": {},
"outputs": [],
"source": [
"from IPython.display import Markdown, display\n",
"\n",
"from llama_index.core import SimpleDirectoryReader, VectorStoreIndex\n",
"from llama_index.vector_stores.relyt import RelytVectorStore"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "7d782f76",
"metadata": {},
"source": [
"Download Data"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5104674e",
"metadata": {},
"outputs": [],
"source": [
"!mkdir -p 'data/paul_graham/'\n",
"!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "68cbd239-880e-41a3-98d8-dbb3fab55431",
"metadata": {},
"outputs": [],
"source": [
"# load documents\n",
"documents = SimpleDirectoryReader(\"./data/paul_graham\").load_data()"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ba1558b3",
"metadata": {},
"outputs": [],
"source": [
"# initialize without metadata filter\n",
"from llama_index.core import StorageContext\n",
"\n",
"vector_store = RelytVectorStore(client=client)\n",
"storage_context = StorageContext.from_defaults(vector_store=vector_store)\n",
"index = VectorStoreIndex.from_documents(\n",
" documents, storage_context=storage_context\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "04304299-fc3e-40a0-8600-f50c3292767e",
"metadata": {},
"source": [
"#### Query Index"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "35369eda",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/embeddings \"HTTP/1.1 200 OK\"\n",
"HTTP Request: POST https://api.openai.com/v1/embeddings \"HTTP/1.1 200 OK\"\n",
"INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n",
"HTTP Request: POST https://api.openai.com/v1/chat/completions \"HTTP/1.1 200 OK\"\n"
]
}
],
"source": [
"# set Logging to DEBUG for more detailed outputs\n",
"query_engine = index.as_query_engine()\n",
"response = query_engine.query(\"What did the author do growing up?\")"
]
},
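{
"attachments": {},
"cell_type": "markdown",
"id": "relyt-inspect-sources",
"metadata": {},
"source": [
"Optionally, inspect which nodes were retrieved from Relyt to answer the query. This is a small sketch using the standard `source_nodes` attribute of a llama_index query response, not anything specific to Relyt:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "relyt-inspect-sources-code",
"metadata": {},
"outputs": [],
"source": [
"# Print the similarity score and id of each retrieved node.\n",
"for node_with_score in response.source_nodes:\n",
"    print(node_with_score.score, node_with_score.node.node_id)"
]
},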
{
"cell_type": "code",
"execution_count": null,
"id": "bedbb693-725f-478f-be26-fa7180ea38b2",
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"<b>The author, growing up, worked on writing and programming. They wrote short stories and also tried writing programs on an IBM 1401 computer. They later got a microcomputer and started programming more extensively, writing simple games and a word processor.</b>"
],
"text/plain": [
"<IPython.core.display.Markdown object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"display(Markdown(f\"<b>{response}</b>\"))"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
1 change: 1 addition & 0 deletions llama-index-cli/llama_index/cli/upgrade/mappings.json
@@ -422,6 +422,7 @@
"ChatGPTRetrievalPluginClient": "llama_index.vector_stores.chatgpt_plugin",
"TairVectorStore": "llama_index.vector_stores.tair",
"RedisVectorStore": "llama_index.vector_stores.redis",
"RelytVectorStore": "llama_index.vector_stores.relyt",
"set_google_config": "llama_index.vector_stores.google",
"GoogleVectorStore": "llama_index.vector_stores.google",
"MetalVectorStore": "llama_index.vector_stores.metal",
@@ -422,6 +422,7 @@
"ChatGPTRetrievalPluginClient": "llama_index.vector_stores.chatgpt_plugin",
"TairVectorStore": "llama_index.vector_stores.tair",
"RedisVectorStore": "llama_index.vector_stores.redis",
"RelytVectorStore": "llama_index.vector_stores.relyt",
"set_google_config": "llama_index.vector_stores.google",
"GoogleVectorStore": "llama_index.vector_stores.google",
"MetalVectorStore": "llama_index.vector_stores.metal",