Embedding_dict is empty despite include_embedding=True #13141
Replies: 1 comment
Regarding scalable storage solutions for embeddings, the repository is exploring more efficient methods, including the integration of vector stores like DeepLake. This approach aims to provide a more scalable and efficient way to manage embeddings, especially for large knowledge graphs.
Hello, thank you in advance!
I am building a knowledge graph from scratch using `NebulaGraphStore`, upserting triplets with `upsert_triplet_and_node` and setting `include_embedding=True`. Despite this, at query time I always get the warning "Index was not constructed with embeddings, skipping embedding usage...". I looked through the codebase and found that this warning is emitted when `len(self._index_struct.embedding_dict) == 0` in the `KGTableRetriever`. I printed the `embedding_dict` myself and it was indeed empty. Is this behavior expected, or is it a bug? I assumed the embeddings would be added to the `embedding_dict` when I upsert triplets.
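For illustration, the check behind the warning can be sketched in plain Python (a simplified stand-in, not the actual `llama_index` source): the retriever inspects whether the index struct holds any triplet embeddings and silently falls back to keyword-only retrieval when the dict is empty.

```python
# Simplified stand-in for the KGTableRetriever embedding guard
# (assumption: names and logic are illustrative, not the real code).

def choose_retrieval_mode(embedding_dict: dict) -> str:
    """Return 'hybrid' when triplet embeddings exist, else fall back to 'keyword'."""
    if len(embedding_dict) == 0:
        # This is the condition that produces the warning the question reports.
        print("Index was not constructed with embeddings, skipping embedding usage...")
        return "keyword"
    return "hybrid"

# An index whose triplets were upserted without embeddings being stored:
print(choose_retrieval_mode({}))                             # falls back to keyword
print(choose_retrieval_mode({"(a, rel, b)": [0.1, 0.2]}))    # hybrid retrieval
```

This is why flipping the embedding flag at query time changes nothing: once `embedding_dict` is empty, the embedding path is skipped regardless of the retriever's configured mode.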
Here are the two ways I generate my query_engine, and both show this warning. I see no change in results when setting the embedding flag to True or False. Is the embedding_dict actually used anywhere to generate context when answering the query?
Way 1:
Way 2:
In addition, I am curious to know when a vector_store will be supported to store the embeddings for a knowledge graph. I feel that an embedding_dict is not scalable for large knowledge graphs.
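To make the scalability concern concrete, here is a hedged sketch (not the actual `llama_index` implementation) of how an embedding-mode KG retriever could use `embedding_dict`: embed the query, brute-force cosine similarity against every stored triplet embedding, and keep the top-k triplets as extra context. A linear scan over every entry per query is exactly what stops scaling for large graphs, which is what a vector store would replace.

```python
import math

def top_k_triplets(query_emb, embedding_dict, k=2):
    """Toy brute-force top-k lookup over a triplet -> embedding dict.

    Illustrative only: every query scans all len(embedding_dict) entries,
    i.e. O(n) per query, unlike an ANN-backed vector store.
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    scored = sorted(
        embedding_dict.items(),
        key=lambda kv: cosine(query_emb, kv[1]),
        reverse=True,
    )
    return [triplet for triplet, _ in scored[:k]]

# Hypothetical 2-d embeddings for three triplets:
emb = {
    "(Alice, works_at, Acme)": [1.0, 0.0],
    "(Bob, lives_in, Paris)": [0.0, 1.0],
    "(Acme, located_in, Paris)": [0.7, 0.7],
}
print(top_k_triplets([1.0, 0.1], emb, k=2))
# -> ['(Alice, works_at, Acme)', '(Acme, located_in, Paris)']
```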
@logan-markewich @wey-gu Thanks so much for your time! I really, really appreciate it!