LangGraph — Stateful AI Agents Tutorial
Introduction
LangGraph enables building stateful agents with explicit control flow. It is well suited to complex workflows that require state management and conditional routing.
- Installation
- Basic Graph
- Conditional Routing
- Multi-Step Agent
- Persistent State
- With LLM Integration
- Conclusion
- FAQ
Installation
pip install langgraph langchain
Basic Graph
from typing import TypedDict

from langgraph.graph import StateGraph

class State(TypedDict):
    question: str
    answer: str

def process_question(state: State) -> dict:
    # Nodes return a partial update; LangGraph merges it into the state.
    return {"answer": f"Answer to {state['question']}"}

graph = StateGraph(State)
graph.add_node("process", process_question)
graph.set_entry_point("process")
graph.set_finish_point("process")

compiled_graph = graph.compile()
result = compiled_graph.invoke({"question": "What is AI?"})
# result["answer"] -> "Answer to What is AI?"
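A detail worth noting: nodes return partial updates, not the whole state. The runtime merges each update into the running state, which a plain dict merge can illustrate (pure Python, no LangGraph required):

```python
from typing import TypedDict

class State(TypedDict):
    question: str
    answer: str

def process_question(state: State) -> dict:
    return {"answer": f"Answer to {state['question']}"}

# Roughly what the graph runtime does with a node's return value:
state: State = {"question": "What is AI?", "answer": ""}
state = {**state, **process_question(state)}
print(state["answer"])  # Answer to What is AI?
```

Because updates are merged rather than replacing the state, each node only needs to return the keys it changes.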
Conditional Routing
def should_research(state: State) -> str:
    return "complex" if "complex" in state["question"].lower() else "simple"

graph = StateGraph(State)
# The entry node must actually exist; here it just passes state through.
graph.add_node("route", lambda s: {})
graph.add_node("simple", lambda s: {"answer": "Simple answer"})
graph.add_node("complex", lambda s: {"answer": "Detailed answer"})
graph.set_entry_point("route")
graph.add_conditional_edges(
    "route",
    should_research,
    {"simple": "simple", "complex": "complex"},
)
graph.set_finish_point("simple")
graph.set_finish_point("complex")
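The router is just a function of the state, so it can be exercised on its own before the graph is compiled:

```python
def should_research(state: dict) -> str:
    # Returns the key used to pick the next node in the conditional edge map.
    return "complex" if "complex" in state["question"].lower() else "simple"

print(should_research({"question": "What is AI?"}))               # simple
print(should_research({"question": "A complex legal question"}))  # complex
```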
Multi-Step Agent
class AgentState(TypedDict):
    question: str
    research: str
    answer: str

def research(state: AgentState) -> dict:
    # Placeholder: a real implementation would call a search tool or retriever.
    return {"research": "research findings"}

def generate_answer(state: AgentState) -> dict:
    return {"answer": f"Answer: {state['research']}"}

graph = StateGraph(AgentState)
graph.add_node("research", research)
graph.add_node("answer", generate_answer)
graph.set_entry_point("research")
graph.add_edge("research", "answer")
graph.set_finish_point("answer")
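Conceptually, this linear graph calls the nodes in edge order and folds each partial update into the state; a minimal sketch without LangGraph:

```python
def research(state: dict) -> dict:
    return {"research": "research findings"}

def generate_answer(state: dict) -> dict:
    return {"answer": f"Answer: {state['research']}"}

state = {"question": "What is AI?"}
for node in (research, generate_answer):  # follows the research -> answer edge
    state = {**state, **node(state)}
print(state["answer"])  # Answer: research findings
```

The graph form becomes valuable once edges branch or loop, where hand-rolled sequencing gets brittle.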
Persistent State
import sqlite3

from langgraph.checkpoint.sqlite import SqliteSaver

# SqliteSaver wraps a sqlite3 connection (requires langgraph-checkpoint-sqlite).
memory = SqliteSaver(sqlite3.connect(":memory:"))
compiled = graph.compile(checkpointer=memory)

# The thread_id keys the saved state: calls that share it share history.
config = {"configurable": {"thread_id": "user_123"}}
result = compiled.invoke({"question": "Q1"}, config)
result = compiled.invoke({"question": "Q2"}, config)  # resumes the same thread
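The checkpointer keys saved state by `thread_id`, so separate threads keep separate histories. A toy in-memory store illustrates the idea (an illustration of the concept only, not the SqliteSaver API):

```python
# Hypothetical store: thread_id -> list of checkpointed states.
checkpoints: dict[str, list[dict]] = {}

def invoke_with_memory(state: dict, thread_id: str) -> list[dict]:
    # Append this run's state to the thread's history and return the history.
    history = checkpoints.setdefault(thread_id, [])
    history.append(state)
    return history

invoke_with_memory({"question": "Q1"}, "user_123")
invoke_with_memory({"question": "Q2"}, "user_123")  # same thread: 2 checkpoints
invoke_with_memory({"question": "Q3"}, "user_456")  # new thread: fresh history
```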
With LLM Integration
from langchain_openai import ChatOpenAI

llm = ChatOpenAI()  # reads OPENAI_API_KEY from the environment

def agent_step(state: State) -> dict:
    response = llm.invoke(state["question"])
    return {"answer": response.content}

graph = StateGraph(State)
graph.add_node("agent", agent_step)
graph.set_entry_point("agent")
graph.set_finish_point("agent")
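Since the node only relies on `.invoke` returning an object with a `.content` attribute, it can be exercised with a stub model before wiring in a real API key (the stub below is a hypothetical test double, not part of LangChain):

```python
from types import SimpleNamespace

class StubLLM:
    """Test double mimicking the minimal ChatOpenAI interface used here."""
    def invoke(self, prompt: str):
        return SimpleNamespace(content=f"Echo: {prompt}")

def agent_step(state: dict) -> dict:
    response = StubLLM().invoke(state["question"])
    return {"answer": response.content}

print(agent_step({"question": "What is AI?"})["answer"])  # Echo: What is AI?
```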
Conclusion
LangGraph provides robust state management and explicit control flow for complex agent workflows, making it a solid foundation for production multi-step applications.
FAQ
Q: When should I use LangGraph instead of simple chains? A: Use LangGraph for stateful, branching workflows; chains suffice for simple linear sequences.
Q: Can LangGraph handle branching logic? A: Yes, conditional edges route execution between nodes based on the current state.
Q: Is LangGraph production-ready? A: Yes; it is actively maintained and used in production systems.