SurrealDBVectorStore

SurrealDB is a unified, multi-model database purpose-built for AI systems. It combines structured and unstructured data (including vector search, graph traversal, relational queries, full-text search, document storage, and time-series data) into a single ACID-compliant engine, scaling from a 3 MB edge binary to petabyte-scale clusters in the cloud. By eliminating the need for multiple specialized stores, SurrealDB simplifies architectures, reduces latency, and ensures consistency for AI workloads.

Why SurrealDB Matters for GenAI Systems

  • One engine for storage and memory: Combine durable storage and fast, agent-friendly memory in a single system, providing all the data your agent needs and removing the need to sync multiple systems.
  • One-hop memory for agents: Run vector search, graph traversal, semantic joins, and transactional writes in a single query, giving LLM agents fast, consistent memory access without stitching relational, graph and vector databases together.
  • In-place inference and real-time updates: SurrealDB enables agents to run inference next to data and receive millisecond-fresh updates, critical for real-time reasoning and collaboration.
  • Versioned, durable context: SurrealDB supports time-travel queries and versioned records, letting agents audit or “replay” past states for consistent, explainable reasoning.
  • Plug-and-play agent memory: Expose AI memory as a native concept, making it easy to use SurrealDB as a drop-in backend for AI frameworks.

This notebook covers how to get started with the SurrealDB vector store.

Setup

You can run SurrealDB locally or start with a free SurrealDB cloud account.

To run locally, you have two options:

  1. Install SurrealDB and run it in-memory with:

    surreal start -u root -p root
  2. Run with Docker.

    docker run --rm --pull always -p 8000:8000 surrealdb/surrealdb:latest start
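
Once the server is running, you can sanity-check the connection from Python before wiring up the vector store. This is a minimal sketch using the surrealdb SDK and the root credentials from above; the exact shape of the query result depends on the SDK version.

from surrealdb import Surreal

# Connect to the local server started above (root/root credentials assumed)
conn = Surreal("ws://localhost:8000/rpc")
conn.signin({"username": "root", "password": "root"})
conn.use("langchain", "demo")

# A trivial query; if this returns without raising, the server is reachable
print(conn.query("RETURN 1"))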

Install dependencies

Install the langchain-surrealdb and surrealdb Python packages.

# -- Using pip
pip install --upgrade langchain-surrealdb surrealdb
# -- Using poetry
poetry add langchain-surrealdb surrealdb
# -- Using uv
uv add --upgrade langchain-surrealdb surrealdb

To run this notebook, install the additional dependencies used in this example:

!poetry add --quiet --group docs langchain-ollama langchain-surrealdb

Initialization

from langchain_ollama import OllamaEmbeddings
from langchain_surrealdb.vectorstores import SurrealDBVectorStore
from surrealdb import Surreal

# Connect to the local SurrealDB instance over the WebSocket RPC endpoint
conn = Surreal("ws://localhost:8000/rpc")
conn.signin({"username": "root", "password": "root"})
# Select the namespace ("langchain") and database ("demo") to work in
conn.use("langchain", "demo")
vector_store = SurrealDBVectorStore(OllamaEmbeddings(model="llama3.2"), conn)
API Reference: OllamaEmbeddings

Manage vector store

Add items to vector store

from langchain_core.documents import Document

_url = "https://surrealdb.com"
d1 = Document(page_content="foo", metadata={"source": _url})
d2 = Document(page_content="SurrealDB", metadata={"source": _url})
d3 = Document(page_content="bar", metadata={"source": _url})
d4 = Document(page_content="this is surreal", metadata={"source": _url})

vector_store.add_documents(documents=[d1, d2, d3, d4], ids=["1", "2", "3", "4"])
API Reference: Document
['1', '2', '3', '4']
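
If you are starting from raw strings rather than Document objects, the standard LangChain VectorStore interface also provides add_texts, which (in recent langchain-core versions) wraps the texts into documents and delegates to add_documents. A minimal sketch; the text, metadata, and id below are illustrative:

# Add raw strings with optional per-text metadata; returns the assigned ids
vector_store.add_texts(
    texts=["SurrealQL is SurrealDB's query language"],
    metadatas=[{"source": _url}],
    ids=["5"],
)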

Update items in vector store

updated_document = Document(
    page_content="zar", metadata={"source": "https://example.com"}
)

vector_store.add_documents(documents=[updated_document], ids=["3"])
['3']

Delete items from vector store

vector_store.delete(ids=["3"])

Query vector store

Once your vector store has been created and the relevant documents have been added, you will most likely want to query it while running your chain or agent.

Query directly

Performing a simple similarity search can be done as follows:

results = vector_store.similarity_search(
    query="surreal", k=1, custom_filter={"source": "https://surrealdb.com"}
)
for doc in results:
    print(f"{doc.page_content} [{doc.metadata}]")  # noqa: T201
this is surreal [{'source': 'https://surrealdb.com'}]

If you want to execute a similarity search and receive the corresponding scores, you can run:

results = vector_store.similarity_search_with_score(
    query="thud", k=1, custom_filter={"source": "https://surrealdb.com"}
)
for doc, score in results:
    print(f"[similarity={score:.0%}] {doc.page_content}")  # noqa: T201
[similarity=57%] this is surreal
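
If you only want hits above a certain similarity, you can filter the scored results in plain Python; the 0.5 threshold below is an arbitrary example:

results = vector_store.similarity_search_with_score(query="surreal", k=4)
confident_hits = [doc for doc, score in results if score >= 0.5]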

Query by turning into retriever

You can also transform the vector store into a retriever for easier use in your chains. With search_type="mmr" (maximal marginal relevance), lambda_mult controls the trade-off between relevance and diversity: values near 1 favour relevance, values near 0 favour diversity.

retriever = vector_store.as_retriever(
    search_type="mmr", search_kwargs={"k": 1, "lambda_mult": 0.5}
)
retriever.invoke("surreal")
[Document(id='4', metadata={'source': 'https://surrealdb.com'}, page_content='this is surreal')]
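
Because search_kwargs are forwarded to the underlying search method, the custom_filter shown earlier also works through the default similarity retriever. A minimal sketch:

filtered_retriever = vector_store.as_retriever(
    search_kwargs={"k": 1, "custom_filter": {"source": "https://surrealdb.com"}}
)
filtered_retriever.invoke("surreal")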

Usage for retrieval-augmented generation

For end-to-end guides on using a vector store for retrieval-augmented generation (RAG), see the RAG tutorials and how-to guides in the LangChain documentation.
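
As a quick illustration, the retriever plugs directly into a small RAG chain. The sketch below assumes a local Ollama server with the llama3.2 model (the same one used for the embeddings); the prompt and formatting helper are illustrative, not part of the SurrealDB integration.

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_ollama import ChatOllama

prompt = ChatPromptTemplate.from_template(
    "Answer the question using only this context:\n\n{context}\n\nQuestion: {question}"
)
llm = ChatOllama(model="llama3.2")


def format_docs(docs):
    # Join the retrieved documents into a single context string
    return "\n\n".join(doc.page_content for doc in docs)


rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)

rag_chain.invoke("What is SurrealDB?")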

API reference

For detailed documentation of all SurrealDBVectorStore features and configurations head to the API reference: https://python.langchain.com/api_reference/surrealdb/index.html
