# GraphRAG SDK
The most accurate Graph RAG framework. Built on FalkorDB.
GraphRAG SDK builds knowledge graphs from documents and answers questions over them using graph-based retrieval-augmented generation. Every pipeline step is a swappable strategy behind an abstract interface.
## Key Highlights
- #1 on GraphRAG-Bench Novel -- 63.73 ACC on 2,010 questions (benchmark)
- Simple API -- `ingest()` + `completion()` with sensible defaults
- 100+ LLM providers via LiteLLM (OpenAI, Azure, Anthropic, Cohere, Ollama, and more)
- Fully modular -- swap chunking, extraction, resolution, retrieval, and reranking strategies
- Production-ready -- async-first, connection pooling, circuit breaker, batched writes
- Full provenance -- every answer traces back to its source document and chunk
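Because both the LLM and the embedder are addressed through LiteLLM, switching providers is a construction-time change: pass a different `provider/model` identifier string, as in the quick start below. A minimal sketch, assuming LiteLLM-style model identifiers (the specific model names here are illustrative, not an endorsement of defaults):

```python
from graphrag_sdk import LiteLLM, LiteLLMEmbedder

# Any LiteLLM provider string should work; these identifiers are illustrative.
cloud_llm = LiteLLM(model="anthropic/claude-3-5-sonnet-20241022")
local_llm = LiteLLM(model="ollama/llama3")  # local model served by Ollama
embedder = LiteLLMEmbedder(model="cohere/embed-english-v3.0")
```

Provider credentials (e.g. `ANTHROPIC_API_KEY`) are picked up from the environment the way LiteLLM normally resolves them.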
## Quick Start
```python
import asyncio

from graphrag_sdk import GraphRAG, ConnectionConfig, LiteLLM, LiteLLMEmbedder


async def main():
    async with GraphRAG(
        connection=ConnectionConfig(host="localhost", graph_name="my_graph"),
        llm=LiteLLM(model="openai/gpt-4o"),
        embedder=LiteLLMEmbedder(model="openai/text-embedding-3-small"),
    ) as rag:
        await rag.ingest("my_document.pdf")
        await rag.finalize()

        answer = await rag.completion("What is the main topic?")
        print(answer.answer)


asyncio.run(main())
```
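Running the snippet above requires the SDK installed and a FalkorDB instance reachable at `localhost`. A typical setup, assuming the package is published on PyPI under this name and using the official FalkorDB Docker image:

```shell
# Assumed distribution name -- check PyPI for the exact package
pip install graphrag-sdk

# Start FalkorDB locally on its default port (official image)
docker run -p 6379:6379 falkordb/falkordb
```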
## Next Steps
- Getting Started -- Full tutorial from install to first query
- Architecture -- How the 9-step pipeline works
- Strategies -- All swappable strategy ABCs and built-in options
- Benchmark -- Methodology and reproduction instructions
- API Reference -- Full API documentation