A powerful integration between Cognee and LangGraph that provides intelligent knowledge management and retrieval capabilities for AI agents.
Note: This package requires Python 3.10+ and uses async tools. All agents must use `await agent.ainvoke()` instead of `agent.invoke()`.
cognee-integration-langgraph combines Cognee's advanced knowledge storage and retrieval system with LangGraph's workflow orchestration capabilities. This integration allows you to build AI agents that can efficiently store, search, and retrieve information from a persistent knowledge base.
- Smart Knowledge Storage: Add and persist information using Cognee's advanced indexing
- Semantic Search: Retrieve relevant information using natural language queries
- Session Management: Support for user-specific data isolation
- LangGraph Integration: Seamless integration with LangGraph's agent framework
- Async Support: Built with async/await for high-performance applications
```shell
# Basic installation
pip install cognee-integration-langgraph

# With guide dependencies (needed for examples/guide.ipynb)
pip install "cognee-integration-langgraph[guide]"
```

The `[guide]` extra includes additional dependencies (mediawikiapi, wikibase-rest-api-client) needed for the WikiData functionality demonstrated in the guide notebook.
```python
import asyncio

import cognee
from langchain.agents import create_agent
from langchain_core.messages import HumanMessage

from cognee_integration_langgraph import get_sessionized_cognee_tools


async def main():
    # Get sessionized tools with a custom session ID
    add_tool, search_tool = get_sessionized_cognee_tools("user-123")

    # Or get regular tools without sessionization (auto-generates a session ID)
    # add_tool, search_tool = get_sessionized_cognee_tools()

    # Create an agent with memory capabilities
    agent = create_agent(
        "openai:gpt-4o-mini",
        tools=[add_tool, search_tool],
    )

    # Use the agent (note: must use await with .ainvoke())
    response = await agent.ainvoke({
        "messages": [
            HumanMessage(content="Remember: I like pizza and coding in Python")
        ]
    })
    print(response["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```

`get_sessionized_cognee_tools` returns cognee tools with optional user-specific sessionization.
Parameters:
`session_id` (optional): User identifier for data isolation. If not provided, a random session ID is auto-generated.
Returns: `(add_tool, search_tool)`, a tuple of tools for storing and searching data.
Usage:
```python
# With sessionization (recommended for multi-user apps)
add_tool, search_tool = get_sessionized_cognee_tools("user-123")

# Without explicit session (auto-generates a session ID)
add_tool, search_tool = get_sessionized_cognee_tools()
```

- `add_tool`: Store information in the knowledge base
- `search_tool`: Search and retrieve previously stored information
cognee-integration-langgraph supports user-specific sessions to isolate data between different users or contexts:
```python
import asyncio

from langchain.agents import create_agent

from cognee_integration_langgraph import get_sessionized_cognee_tools


async def main():
    # Each user gets their own isolated session
    user1_add, user1_search = get_sessionized_cognee_tools("user-123")
    user2_add, user2_search = get_sessionized_cognee_tools("user-456")

    # Create separate agents for each user
    agent1 = create_agent("openai:gpt-4o-mini", tools=[user1_add, user1_search])
    agent2 = create_agent("openai:gpt-4o-mini", tools=[user2_add, user2_search])

    # Each agent works with isolated data
    await agent1.ainvoke({"messages": [...]})
    await agent2.ainvoke({"messages": [...]})


if __name__ == "__main__":
    asyncio.run(main())
```

Copy the `.env.template` file to `.env` and fill in the required API keys:
```shell
cp .env.template .env
```

Then edit the `.env` file and set both keys using your OpenAI API key:
```
OPENAI_API_KEY=your-openai-api-key-here
LLM_API_KEY=your-openai-api-key-here
```

Check out the `examples/` directory for more comprehensive usage examples:
- `examples/example.py`: Complete workflow with contract management
- `examples/guide.ipynb`: Jupyter notebook tutorial with step-by-step guidance
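If you want to load the `.env` file from Python without an extra dependency, a minimal stdlib loader is enough for simple `KEY=value` files. This is a sketch with an invented `load_env` name; real projects often use the python-dotenv package instead:

```python
import os
from pathlib import Path


def load_env(path: str = ".env") -> None:
    # Minimal loader: KEY=value lines; blank lines and '#' comments are
    # ignored. Variables already set in the environment win over the file.
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())
```

Call `load_env()` once at startup, before creating any agents.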
- Python 3.10+
- OpenAI API key
- Dependencies managed automatically via `pyproject.toml`
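A quick preflight check can catch a missing interpreter version or API key before the first agent call. This is our own convenience sketch, not part of the package:

```python
import os
import sys


def preflight() -> list:
    # Collect human-readable problems instead of raising, so callers can
    # print them all at once before exiting.
    problems = []
    if sys.version_info < (3, 10):
        problems.append("Python 3.10+ is required")
    for var in ("OPENAI_API_KEY", "LLM_API_KEY"):
        if not os.environ.get(var):
            problems.append(f"{var} is not set")
    return problems
```

An empty list means the environment looks ready; otherwise print the problems and exit.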