If you're just starting with LangGraph and want to build real AI workflows using Google Gemini, this beginner-friendly tutorial will walk you through everything step by step.

In this LangGraph tutorial, we will build a simple multi-node LangGraph project in which one node decides the route and the graph moves to different nodes using conditional edges.
- ✅ 3+ nodes
- ✅ Conditional Routing (If/Else Flow)
- ✅ Real-world scenario workflow
- What You'll Build in This Tutorial
- What is LangGraph (Simple Explanation)
- Install Required Packages
- Setup Gemini API Key
- Step 1: Import Required Modules
- Step 2: Define the Shared State
- Step 3: Initialize Gemini Model
- Step 4: Create Nodes (Router + 3 Processing Nodes)
- Step 5: Create Conditional Routing Function
- Step 6: Build the LangGraph Workflow
- Step 7: Test Your LangGraph Workflow
- Expected Output Flow
- What You Learned (Beginner Summary)
- Source Code
## What You'll Build in This Tutorial

We'll create an AI Customer Support Workflow.

User queries can be of 3 types:
| User Query Type | Example | Route |
|---|---|---|
| Refund related | "I want a refund" | Refund Node |
| Technical issue | "App is crashing" | Tech Node |
| General question | "What is pricing?" | General Node |
So your graph behaves like:

```
Router Node → Refund Node → END
Router Node → Tech Node → END
Router Node → General Node → END
```
## What is LangGraph (Simple Explanation)

LangGraph is a framework that helps you design workflows as a graph, where:
### Node

A node is a step/function in your AI workflow.

Example nodes:

- Detect intent
- Generate response
- Validate output
- Summarize content
### Edge

An edge is the connection between steps.
### Conditional Edge

A conditional edge means:

"Go to the next node based on the output."
Example:

- If intent = refund → go to refund node
- If intent = tech → go to tech node
- Else → go to general node
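Before touching LangGraph's API, it can help to see the whole idea in miniature: nodes are just functions that update a state dictionary, and edges say which node runs next. This is a plain-Python sketch of the concept, not LangGraph's actual internals:

```python
# A graph in miniature: nodes are functions that update state,
# edges name the next node to run. "END" stops the loop.
def detect_intent(state):
    state["intent"] = "refund" if "refund" in state["query"] else "general"
    return state

def answer(state):
    state["response"] = f"Handling a {state['intent']} request."
    return state

nodes = {"detect_intent": detect_intent, "answer": answer}
edges = {"detect_intent": "answer", "answer": "END"}

state, current = {"query": "I want a refund"}, "detect_intent"
while current != "END":
    state = nodes[current](state)
    current = edges[current]

print(state["response"])  # Handling a refund request.
```

LangGraph gives you the same loop with extras (typed state, conditional edges, streaming), which is what the rest of this tutorial builds.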
## Install Required Packages

Run this in your terminal:

```bash
pip install langgraph langchain langchain-google-genai
```
## Setup Gemini API Key

Linux / Mac:

```bash
export GOOGLE_API_KEY="your_api_key_here"
```

Windows PowerShell:

```powershell
setx GOOGLE_API_KEY "your_api_key_here"
```

Note that `setx` only affects new terminal sessions, so restart your terminal after running it.
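A quick way to confirm the key is actually visible to Python before building anything (the `has_api_key` helper is ours, just for this check; `GOOGLE_API_KEY` is the variable name `langchain-google-genai` reads):

```python
import os

# Fail early with a clear message if the Gemini key is missing,
# instead of getting an authentication error mid-workflow.
def has_api_key() -> bool:
    return bool(os.environ.get("GOOGLE_API_KEY"))

if not has_api_key():
    print("GOOGLE_API_KEY is not set - see the export/setx commands above.")
```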
## Step 1: Import Required Modules

```python
from typing import TypedDict, Optional
from langgraph.graph import StateGraph, END
from langchain_google_genai import ChatGoogleGenerativeAI
```
## Step 2: Define the Shared State

LangGraph nodes communicate through a shared state.

```python
class SupportState(TypedDict):
    user_query: str
    intent: Optional[str]
    response: Optional[str]
```
### Why do we need State?

Because every node needs access to:

- ✅ user input
- ✅ router output (intent)
- ✅ final response
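Note that a node does not have to return the whole state: it returns only the keys it changed, and LangGraph merges that partial update into the shared state. The merge behaves roughly like a dict update (this is a simplified illustration, not LangGraph's internal code):

```python
# Simplified picture of how LangGraph applies a node's return value:
# the partial dict a node returns is merged over the current state.
state = {"user_query": "I want a refund", "intent": None, "response": None}

node_update = {"intent": "refund"}   # what a router node might return
state = {**state, **node_update}     # LangGraph-style merge

print(state)
# {'user_query': 'I want a refund', 'intent': 'refund', 'response': None}
```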
## Step 3: Initialize Gemini Model

```python
llm = ChatGoogleGenerativeAI(
    model="gemini-2.5-flash",
    temperature=0.2
)
```

gemini-2.5-flash is fast and a great fit for beginners.

Note: Model availability may change, so please use a currently available model.
## Step 4: Create Nodes (Router + 3 Processing Nodes)

### Node 1: Router Node (Intent Detector)

This node decides what the user wants.

```python
def router_node(state: SupportState):
    query = state["user_query"]
    prompt = f"""
You are an intent classifier.
Classify the user query into one of these intents only:
- refund
- tech
- general

User Query: {query}

Return only one word: refund OR tech OR general.
"""
    intent = llm.invoke(prompt).content.strip().lower()
    return {"intent": intent}
```
The output of this node updates state like:

```json
{
  "user_query": "I want refund",
  "intent": "refund",
  "response": null
}
```
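In practice, LLM classifiers sometimes return extra punctuation or a word outside the expected set, so it's worth normalizing the router's output before routing on it. A small defensive sketch (the `normalize_intent` helper is ours, not part of LangGraph):

```python
# Clean up a raw LLM classification and fall back to "general"
# for anything outside the three intents we route on.
VALID_INTENTS = {"refund", "tech", "general"}

def normalize_intent(raw: str) -> str:
    cleaned = raw.strip().strip('."\'').lower()
    return cleaned if cleaned in VALID_INTENTS else "general"

print(normalize_intent("Refund."))       # refund
print(normalize_intent("billing help"))  # general
```

You could call this helper on `intent` inside `router_node` so the graph never routes on an unexpected value.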
### Node 2: Refund Support Node

```python
def refund_node(state: SupportState):
    query = state["user_query"]
    prompt = f"""
You are a refund support assistant.
User says: {query}
Reply politely and explain the refund process in 3-5 lines.
"""
    answer = llm.invoke(prompt).content
    return {"response": answer}
```
### Node 3: Technical Support Node

```python
def tech_node(state: SupportState):
    query = state["user_query"]
    prompt = f"""
You are a technical support assistant.
User says: {query}
Give troubleshooting steps in bullet points.
"""
    answer = llm.invoke(prompt).content
    return {"response": answer}
```
### Node 4: General Support Node

```python
def general_node(state: SupportState):
    query = state["user_query"]
    prompt = f"""
You are a customer support assistant.
User says: {query}
Answer clearly and in a friendly tone.
"""
    answer = llm.invoke(prompt).content
    return {"response": answer}
```
## Step 5: Create Conditional Routing Function

This function decides which node should run next.

```python
def route_by_intent(state: SupportState):
    intent = state["intent"]
    if intent == "refund":
        return "refund_node"
    elif intent == "tech":
        return "tech_node"
    else:
        return "general_node"
```

This is the heart of conditional edges.
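Because the routing function is plain Python with no LLM call, you can sanity-check it in isolation before wiring it into the graph. The function is repeated here so the snippet runs on its own:

```python
# route_by_intent contains no LLM call, so it can be tested directly
# with hand-written states before building the graph.
def route_by_intent(state: dict) -> str:
    intent = state["intent"]
    if intent == "refund":
        return "refund_node"
    elif intent == "tech":
        return "tech_node"
    return "general_node"

assert route_by_intent({"intent": "refund"}) == "refund_node"
assert route_by_intent({"intent": "tech"}) == "tech_node"
assert route_by_intent({"intent": "anything else"}) == "general_node"
print("routing checks passed")
```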
β Step 6: Build the LangGraph Workflow
graph = StateGraph(SupportState)
# Add nodes
graph.add_node("router_node", router_node)
graph.add_node("refund_node", refund_node)
graph.add_node("tech_node", tech_node)
graph.add_node("general_node", general_node)
# Start from router
graph.set_entry_point("router_node")
# Conditional edges from router node
graph.add_conditional_edges(
"router_node",
route_by_intent,
{
"refund_node": "refund_node",
"tech_node": "tech_node",
"general_node": "general_node",
}
)
# End workflow after final nodes
graph.add_edge("refund_node", END)
graph.add_edge("tech_node", END)
graph.add_edge("general_node", END)
app = graph.compile()
## Step 7: Test Your LangGraph Workflow

```python
queries = [
    "I want a refund for my purchase",
    "My app is crashing after update",
    "Can you explain your pricing plans?"
]

for q in queries:
    result = app.invoke({"user_query": q, "intent": None, "response": None})
    print("\nUSER:", q)
    print("INTENT:", result["intent"])
    print("RESPONSE:\n", result["response"])
```
## Expected Output Flow

Input: "I want a refund"

Flow: router_node → refund_node → END

Input: "App is crashing"

Flow: router_node → tech_node → END

Input: "Pricing plans?"

Flow: router_node → general_node → END
## What You Learned (Beginner Summary)

By completing this LangGraph beginner tutorial, you now know:

- ✅ How to create a LangGraph workflow
- ✅ How to connect multiple nodes
- ✅ How conditional edges work
- ✅ How to use Gemini in each node
- ✅ How routing creates smart workflows
## Source Code

Please visit here to get the source code: LangGraph Tutorial with Gemini for Beginner.ipynb