
LangChain Integration

Connect your LocusGraph wisdom graph to LangChain agents with LocusGraphMemory and LocusGraphRetriever.

Installation
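Install the client package alongside LangChain. The TypeScript package name `@locusgraph/client` is taken from the imports below; the pip distribution name `locusgraph-client` is an assumption based on the `locusgraph_client` module name.

```shell
# TypeScript / Node.js
npm install @locusgraph/client langchain @langchain/openai

# Python (distribution name assumed from the locusgraph_client module)
pip install locusgraph-client langchain langchain-openai
```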

LocusGraphMemory

LocusGraphMemory gives LangChain chains access to your wisdom graph as conversational context. It stores conversation events automatically and retrieves relevant knowledge on each turn.

TypeScript

import { LocusGraphClient, LocusGraphMemory } from '@locusgraph/client';
import { ConversationChain } from 'langchain/chains';
import { ChatOpenAI } from '@langchain/openai';
 
const client = new LocusGraphClient({
  agentSecret: process.env.LOCUSGRAPH_AGENT_SECRET,
});
 
const memory = new LocusGraphMemory(
  client,
  'default',     // graphId
  'my-agent',    // agentId
  'session-123', // sessionId
);
 
const chain = new ConversationChain({
  llm: new ChatOpenAI(),
  memory,
});
 
const response = await chain.invoke({ input: 'What do you know about my preferences?' });

Python

from locusgraph_client import LocusGraphClient, LocusGraphMemory
from langchain.chains import ConversationChain
from langchain_openai import ChatOpenAI
 
client = LocusGraphClient(agent_secret="your-secret")
 
memory = LocusGraphMemory(
    client,
    "default",              # graph_id
    agent_id="my-agent",
    session_id="session-123",
)
 
chain = ConversationChain(llm=ChatOpenAI(), memory=memory)
response = chain.invoke({"input": "What do you know about my preferences?"})

Memory Keys

LocusGraphMemory exposes three keys to your chain's prompt:

| Key | Description |
| --- | --- |
| `history` | Recent conversation turns from the current session |
| `memories` | Relevant knowledge retrieved from the wisdom graph |
| `memory_info` | Metadata about retrieved contexts (scores, types) |
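To illustrate how the three keys surface in a prompt, here is a minimal sketch using plain string formatting. In a real chain LangChain fills these variables from LocusGraphMemory automatically; the sample values below are hypothetical.

```python
# Minimal sketch: how the three memory keys could appear in a prompt
# template. In a real chain, LangChain injects these values from
# LocusGraphMemory; the sample values below are hypothetical.
TEMPLATE = (
    "Relevant knowledge:\n{memories}\n\n"
    "Retrieval metadata: {memory_info}\n\n"
    "Conversation so far:\n{history}\n"
    "Human: {input}\nAI:"
)

prompt = TEMPLATE.format(
    history="Human: I prefer dark mode.\nAI: Noted!",
    memories="- User prefers dark mode (preference)",
    memory_info="[{'context_type': 'fact', 'score': 0.92}]",
    input="What do you know about my preferences?",
)
print(prompt)
```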

Automatic Event Classification

When LocusGraphMemory stores conversation events, it classifies them automatically:

| Classification | Trigger |
| --- | --- |
| `fact` | User states preferences, personal details, or factual information |
| `action` | User requests a task or the agent performs one |
| `decision` | User makes a choice between alternatives |
| `feedback` | User expresses opinions or satisfaction |
| `observation` | Default for all other conversational content |
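The real classification logic is internal to LocusGraphMemory. The keyword heuristic below is only an illustrative sketch of the five categories and their triggers, with `observation` as the fallback.

```python
# Illustrative sketch of the five event classes. LocusGraph's real
# classifier is internal to LocusGraphMemory; this keyword heuristic
# only demonstrates the category semantics, with 'observation' as
# the default fallback.
def classify_event(text: str) -> str:
    lowered = text.lower()
    if any(k in lowered for k in ("i prefer", "my name is", "i live in")):
        return "fact"       # preferences, personal details, factual info
    if any(k in lowered for k in ("please do", "can you", "run the")):
        return "action"     # task requests
    if any(k in lowered for k in ("let's go with", "i choose", "i'll take")):
        return "decision"   # choice between alternatives
    if any(k in lowered for k in ("great job", "i love", "that was wrong")):
        return "feedback"   # opinions or satisfaction
    return "observation"    # everything else

print(classify_event("I prefer dark mode"))        # fact
print(classify_event("Can you summarize this?"))   # action
print(classify_event("The weather is nice"))       # observation
```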

LocusGraphRetriever

LocusGraphRetriever implements LangChain's retriever interface, letting you plug your wisdom graph into any retrieval chain.

TypeScript

import { LocusGraphClient, LocusGraphRetriever } from '@locusgraph/client';
import { RetrievalQAChain } from 'langchain/chains';
import { ChatOpenAI } from '@langchain/openai';
 
const client = new LocusGraphClient({
  agentSecret: process.env.LOCUSGRAPH_AGENT_SECRET,
});
 
const retriever = new LocusGraphRetriever({
  client,
  graphId: 'default',
  limit: 10,
});
 
const chain = RetrievalQAChain.fromLLM(new ChatOpenAI(), retriever);
const response = await chain.invoke({ query: 'Summarize user preferences' });

Python

from locusgraph_client import LocusGraphClient, LocusGraphRetriever
from langchain.chains import RetrievalQA
from langchain_openai import ChatOpenAI
 
client = LocusGraphClient(agent_secret="your-secret")
 
retriever = LocusGraphRetriever(
    client=client,
    graph_id="default",
    limit=10,
)
 
chain = RetrievalQA.from_llm(llm=ChatOpenAI(), retriever=retriever)
response = chain.invoke({"query": "Summarize user preferences"})

LocusGraphRetriever returns LangChain Document objects. Each document's page_content holds the context content, and metadata includes context_id, context_type, and score.
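As a sketch of that shape, the mapping from a raw context to a document looks roughly like this. The raw payload's field names are assumptions for illustration, and a minimal stand-in Document class is used so the snippet runs without LangChain installed.

```python
from dataclasses import dataclass, field

# Minimal stand-in for langchain_core.documents.Document, so this
# sketch runs without LangChain installed.
@dataclass
class Document:
    page_content: str
    metadata: dict = field(default_factory=dict)

# Hypothetical raw context as the retriever might receive it from
# the LocusGraph API (field names are assumptions for illustration).
raw_context = {
    "id": "ctx_123",
    "type": "fact",
    "content": "User prefers dark mode.",
    "score": 0.92,
}

# The mapping described above: content -> page_content,
# identifiers and score -> metadata.
doc = Document(
    page_content=raw_context["content"],
    metadata={
        "context_id": raw_context["id"],
        "context_type": raw_context["type"],
        "score": raw_context["score"],
    },
)
print(doc.page_content)            # User prefers dark mode.
print(doc.metadata["context_id"])  # ctx_123
```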

Next

TypeScript SDK
Full TypeScript SDK reference.
Python SDK
Full Python SDK reference.