haive.agents.memory.agent¶

Memory Agent - ReactAgent with persistent memory, KG extraction, and auto-summarization.

Phase 1: Memory tools (save/search/KG) + auto-context + auto-summarize

Phase 2: KG extraction from conversations + context-length pre-hook

Optional integration with document_modifiers (summarizer, KG extraction)

Uses haive.core store for persistence, NOT langmem. Supports InMemoryStore (dev) and PostgresStore (production).

Classes¶

ConversationExtraction

Structured extraction from a conversation turn.

ExtractedTriple

A single knowledge graph triple extracted from conversation.

MemoryAgent

ReactAgent with persistent memory, KG extraction, and auto-summarization.

Functions¶

create_memory_agent([name, store, connection_string, ...])

Factory for creating a memory agent with store + memory tools pre-wired.

Module Contents¶

class haive.agents.memory.agent.ConversationExtraction(/, **data)¶

Bases: pydantic.BaseModel

Structured extraction from a conversation turn.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

class haive.agents.memory.agent.ExtractedTriple(/, **data)¶

Bases: pydantic.BaseModel

A single knowledge graph triple extracted from conversation.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

class haive.agents.memory.agent.MemoryAgent(/, **data)¶

Bases: haive.agents.react.agent.ReactAgent

ReactAgent with persistent memory, KG extraction, and auto-summarization.

Features:
  1. Memory tools bound to a store (save/search memories + KG triples)

  2. Auto-context: searches the store for relevant memories before each response

  3. Auto-summarize: when context length exceeds the threshold, summarize and store

  4. KG extraction: extracts knowledge triples from conversations (post-response)

  5. Integration points for document_modifiers (summarizer, KG pipelines)

Store options:
  • Pass store= directly (any LangGraph BaseStore)

  • Pass connection_string= for PostgreSQL (preferred for production)

  • Default: InMemoryStore (dev only)
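A hedged sketch of the first two store options (the InMemoryStore import path follows LangGraph's public API; the PostgreSQL DSN is a placeholder):

```python
from langgraph.store.memory import InMemoryStore

from haive.agents.memory.agent import create_memory_agent

# Dev: explicit InMemoryStore (also the default when neither option is given)
dev_agent = create_memory_agent(name="dev_memory", store=InMemoryStore())

# Production: the factory builds a PostgresStore from the connection string
prod_agent = create_memory_agent(
    name="prod_memory",
    connection_string="postgresql://user:pass@localhost:5432/haive",  # placeholder DSN
)
```

Note that store= takes precedence over connection_string= when both are supplied.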

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

connect_neo4j(config=None)¶

Connect to Neo4j for graph-based KG queries.

Parameters:

config (Any) – Neo4jKGConfig instance, or None for env var defaults

Returns:

Neo4jKGStore instance

Return type:

Any

create_graph_query_tool()¶

Create a LangChain tool that queries the Neo4j KG with natural language.

Uses GraphDBRAGAgent internally to generate Cypher from questions. Requires Neo4j connection (connect_neo4j() must be called first).

Return type:

Any
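A usage sketch of the Neo4j integration on an existing MemoryAgent instance (the variable name agent is illustrative; connect_neo4j() with no argument falls back to environment-variable defaults per the docstring):

```python
# Connect using env var defaults, or pass an explicit Neo4jKGConfig instance
kg_store = agent.connect_neo4j()

# After connecting, expose natural-language KG querying as a LangChain tool;
# internally this uses GraphDBRAGAgent to generate Cypher from questions.
graph_tool = agent.create_graph_query_tool()
```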

extract_kg_from_document(text, allowed_nodes=None)¶

Extract KG triples from a document using GraphTransformer.

Uses haive.agents.document_modifiers.kg for full document-level knowledge graph extraction (more thorough than conversation extraction).

Parameters:
  • text (str) – Document text to extract from

  • allowed_nodes (list[str] | None) – Optional list of entity types to extract

Returns:

List of extracted triples as dicts

Return type:

list[dict]
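A minimal sketch of document-level extraction (the sample text and entity types are illustrative, not from the source):

```python
triples = agent.extract_kg_from_document(
    "Marie Curie won the Nobel Prize in Physics in 1903.",
    allowed_nodes=["Person", "Award"],  # optional: restrict extracted entity types
)
# Each item is a plain dict describing one extracted triple
for triple in triples:
    print(triple)
```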

get_store()¶

Get the underlying store for direct access.

Return type:

Any

query_kg(entity)¶

Query KG triples for an entity from Neo4j.

Parameters:

entity (str) – Entity name to look up

Returns:

List of {subject, predicate, object} dicts

Return type:

list[dict]
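Because the records are plain {subject, predicate, object} dicts, they can be post-processed with ordinary Python. A self-contained sketch (the sample triples are hypothetical):

```python
# Hypothetical triples in the shape query_kg() returns
triples = [
    {"subject": "Ada", "predicate": "works_at", "object": "Acme"},
    {"subject": "Ada", "predicate": "knows", "object": "Grace"},
    {"subject": "Grace", "predicate": "works_at", "object": "Initech"},
]


def outgoing(triples, entity):
    """Collect (predicate, object) pairs where entity is the subject."""
    return [(t["predicate"], t["object"]) for t in triples if t["subject"] == entity]


print(outgoing(triples, "Ada"))  # → [('works_at', 'Acme'), ('knows', 'Grace')]
```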

query_kg_cypher(cypher, params=None)¶

Execute a raw Cypher query against the Neo4j KG.

Parameters:
  • cypher (str) – Cypher query string

  • params (dict | None) – Query parameters

Returns:

List of result records

Return type:

list[dict]
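A sketch of a parameterized query (the Entity label and name property are assumptions about the KG schema, not confirmed by the source; parameterizing via params= keeps user input out of the query string):

```python
cypher = (
    "MATCH (s:Entity {name: $name})-[r]->(o:Entity) "
    "RETURN s.name AS subject, type(r) AS predicate, o.name AS object"
)
records = agent.query_kg_cypher(cypher, params={"name": "Ada"})
```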

run(input_data=None, debug=None, **kwargs)¶

Run with memory: load context -> respond -> extract KG -> maybe summarize.

Flow:
  1. Pre-hook: Load memory context (memories + KG triples + summaries)

  2. Execute: ReactAgent responds (may call memory tools)

  3. Post-hook: Extract KG triples from conversation

  4. Post-hook: Auto-summarize if context length exceeds threshold

Return type:

Any

sync_kg_to_neo4j()¶

Sync all KG triples from store to Neo4j.

Creates the KG in Neo4j from existing triples in the LangGraph store. Must call connect_neo4j() first.

Returns:

Number of triples synced

Return type:

int
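A sketch of the sync flow, which requires an active Neo4j connection first:

```python
store = agent.get_store()         # direct access to the underlying LangGraph store
agent.connect_neo4j()             # must be called before syncing
count = agent.sync_kg_to_neo4j()  # copies stored KG triples into Neo4j
print(f"synced {count} triples")
```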

model_config¶

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

haive.agents.memory.agent.create_memory_agent(name='memory_agent', store=None, connection_string=None, extra_tools=None, user_id='default', auto_extract_kg=True, summarize_threshold=4000, neo4j_config=None, **kwargs)¶

Factory for creating a memory agent with store + memory tools pre-wired.

This is the recommended way to create a MemoryAgent. It resolves the store, creates memory tools, and passes them into the engine so ReactAgent’s tool routing works correctly.

Parameters:
  • name (str) – Agent name

  • store (Any) – Direct store instance (takes precedence over connection_string)

  • connection_string (str | None) – PostgreSQL connection string for production

  • extra_tools (list | None) – Additional tools beyond memory tools

  • user_id (str) – User ID for memory scoping

  • auto_extract_kg (bool) – Enable automatic KG triple extraction from conversations

  • summarize_threshold (int) – Token count threshold for auto-summarization

  • neo4j_config (Any) – Neo4jKGConfig instance, True for env var defaults, or None to skip

  • **kwargs – Additional MemoryAgent/ReactAgent kwargs

Return type:

MemoryAgent
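Putting the factory together, a hedged end-to-end sketch (the string input to run() is an assumption about ReactAgent's accepted input shapes):

```python
from haive.agents.memory.agent import create_memory_agent

agent = create_memory_agent(
    name="assistant_memory",
    user_id="alice",            # scopes saved memories to this user
    auto_extract_kg=True,       # extract KG triples after each response
    summarize_threshold=4000,   # token count before auto-summarization triggers
)

# run() loads memory context, responds, extracts KG triples,
# and summarizes if the context exceeds the threshold.
result = agent.run("Remember that my favorite language is Python.")
```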