How to wait for user input
Prerequisites

This guide assumes familiarity with human-in-the-loop (HIL) concepts and with LangGraph checkpointers (persistence).
Human-in-the-loop (HIL) interactions are crucial for agentic systems. Waiting for human input is a common HIL interaction pattern, allowing the agent to ask the user clarifying questions and await input before proceeding.
We can implement this pattern in LangGraph using the interrupt() function. interrupt allows us to stop graph execution to collect input from a user, then continue execution with the collected input.
Setup
First, we need to install the required packages:
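For example, with npm (these are the packages imported in the snippets below):

npm install @langchain/langgraph @langchain/core @langchain/anthropic zod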
Next, we need to set the API key for Anthropic (the LLM we will use):
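export ANTHROPIC_API_KEY=your-api-key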
Optionally, we can set the API key for LangSmith tracing, which will give us best-in-class observability:
export LANGCHAIN_TRACING_V2="true"
export LANGCHAIN_CALLBACKS_BACKGROUND="true"
export LANGCHAIN_API_KEY=your-api-key
Simple Usage
Let's explore a basic example of using human feedback. A straightforward approach is to create a node, humanFeedback, designed specifically to collect user input. This allows us to gather feedback at a specific, chosen point in our graph.
Steps:

1. Call interrupt() inside the humanFeedback node.
2. Set up a checkpointer to save the state of the graph up until this node.
3. Use new Command({ resume: ... }) to provide the requested value to the humanFeedback node and resume execution.
import { StateGraph, Annotation, START, END, interrupt, MemorySaver } from "@langchain/langgraph";
const StateAnnotation = Annotation.Root({
input: Annotation<string>,
userFeedback: Annotation<string>
});
const step1 = (_state: typeof StateAnnotation.State) => {
console.log("---Step 1---");
return {};
}
const humanFeedback = (_state: typeof StateAnnotation.State) => {
console.log("--- humanFeedback ---");
const feedback: string = interrupt("Please provide feedback");
return {
userFeedback: feedback
};
}
const step3 = (_state: typeof StateAnnotation.State) => {
console.log("---Step 3---");
return {};
}
const builder = new StateGraph(StateAnnotation)
.addNode("step1", step1)
.addNode("humanFeedback", humanFeedback)
.addNode("step3", step3)
.addEdge(START, "step1")
.addEdge("step1", "humanFeedback")
.addEdge("humanFeedback", "step3")
.addEdge("step3", END);
// Set up memory
const memory = new MemorySaver();

// Add the checkpointer when compiling the graph
const graph = builder.compile({
checkpointer: memory,
});
import * as tslab from "tslab";
const drawableGraph = graph.getGraph();
const image = await drawableGraph.drawMermaidPng();
const arrayBuffer = await image.arrayBuffer();
await tslab.display.png(new Uint8Array(arrayBuffer));
Run the graph until the interrupt inside the humanFeedback node:
// Input
const initialInput = { input: "hello world" };
// Thread
const config = { configurable: { thread_id: "1" }, streamMode: "values" as const };
// Run the graph until the first interruption
for await (const event of await graph.stream(initialInput, config)) {
console.log(`--- ${event.input} ---`);
}
// Will log when the graph is interrupted, inside the humanFeedback node.
console.log("--- GRAPH INTERRUPTED ---");
import { Command } from "@langchain/langgraph";
// Continue the graph execution
for await (const event of await graph.stream(
new Command({ resume: "go to step 3! "}),
config,
)) {
console.log(event);
console.log("\n====\n");
}
{ input: 'hello world' }
====
--- humanFeedback ---
{ input: 'hello world', userFeedback: 'go to step 3! ' }
====
---Step 3---
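After resuming, we can verify that the run completed and that the feedback was saved to the thread's state. A quick check, reusing the same config as above:

// Once the graph has run to completion, `next` is empty
const currentState = await graph.getState(config);
console.log(currentState.next); // []
console.log(currentState.values.userFeedback); // "go to step 3! "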
Agent
In the context of agents, waiting for user feedback is useful for asking clarifying questions.
To show this, we will build a relatively simple ReAct-style agent that does tool calling.
We will use Anthropic's model and a fake tool (just for demo purposes).
// Set up the tool
import { ChatAnthropic } from "@langchain/anthropic";
import { tool } from "@langchain/core/tools";
import { StateGraph, MessagesAnnotation, START, END, MemorySaver } from "@langchain/langgraph";
import { ToolNode } from "@langchain/langgraph/prebuilt";
import { AIMessage, ToolMessage } from "@langchain/core/messages";
import { z } from "zod";
const search = tool((_) => {
return "It's sunny in San Francisco, but you better look out if you're a Gemini 😈.";
}, {
name: "search",
description: "Call to surf the web.",
schema: z.string(),
})
const tools = [search]
const toolNode = new ToolNode<typeof MessagesAnnotation.State>(tools)
// Set up the model
const model = new ChatAnthropic({ model: "claude-3-5-sonnet-20240620" })
const askHumanTool = tool((_) => {
return "The human said XYZ";
}, {
name: "askHuman",
description: "Ask the human for input.",
schema: z.string(),
});
const modelWithTools = model.bindTools([...tools, askHumanTool])
// Define nodes and conditional edges
// Define the function that determines whether to continue or not
function shouldContinue(state: typeof MessagesAnnotation.State): "action" | "askHuman" | typeof END {
const lastMessage = state.messages[state.messages.length - 1];
const castLastMessage = lastMessage as AIMessage;
// If there is no function call, then we finish
if (castLastMessage && !castLastMessage.tool_calls?.length) {
return END;
}
// If tool call is askHuman, we return that node
// You could also add logic here to let some system know that there's something that requires Human input
// For example, send a slack message, etc
if (castLastMessage.tool_calls?.[0]?.name === "askHuman") {
console.log("--- ASKING HUMAN ---")
return "askHuman";
}
// Otherwise if it isn't, we continue with the action node
return "action";
}
// Define the function that calls the model
async function callModel(state: typeof MessagesAnnotation.State): Promise<Partial<typeof MessagesAnnotation.State>> {
const messages = state.messages;
const response = await modelWithTools.invoke(messages);
// We return an object with a messages property, because this will get added to the existing list
return { messages: [response] };
}
// We define a fake node to ask the human. interrupt() pauses graph execution here;
// the value later passed via new Command({ resume: ... }) becomes interrupt()'s return value.
function askHuman(state: typeof MessagesAnnotation.State): Partial<typeof MessagesAnnotation.State> {
const toolCallId = (state.messages[state.messages.length - 1] as AIMessage).tool_calls[0].id;
const location: string = interrupt("Please provide your location:");
const newToolMessage = new ToolMessage({
tool_call_id: toolCallId,
content: location,
})
return { messages: [newToolMessage] };
}
// Define a new graph
const messagesWorkflow = new StateGraph(MessagesAnnotation)
// Define the two nodes we will cycle between
.addNode("agent", callModel)
.addNode("action", toolNode)
.addNode("askHuman", askHuman)
// We now add a conditional edge
.addConditionalEdges(
// First, we define the start node. We use `agent`.
// This means these are the edges taken after the `agent` node is called.
"agent",
// Next, we pass in the function that will determine which node is called next.
shouldContinue
)
// We now add a normal edge from `action` to `agent`.
// This means that after `action` is called, `agent` node is called next.
.addEdge("action", "agent")
// After we get back the human response, we go back to the agent
.addEdge("askHuman", "agent")
// Set the entrypoint as `agent`
// This means that this node is the first one called
.addEdge(START, "agent");
// Setup memory
const messagesMemory = new MemorySaver();
// Finally, we compile it!
// This compiles it into a LangChain Runnable,
// meaning you can use it as you would any other runnable
const messagesApp = messagesWorkflow.compile({
checkpointer: messagesMemory,
});
import * as tslab from "tslab";
const drawableGraph2 = messagesApp.getGraph();
const image2 = await drawableGraph2.drawMermaidPng();
const arrayBuffer2 = await image2.arrayBuffer();
await tslab.display.png(new Uint8Array(arrayBuffer2));
Interacting with the Agent
We can now interact with the agent. Let's ask it to ask the user where they are, then tell them the weather.
This should make it use the askHuman tool first, then use the normal search tool.
import { HumanMessage } from "@langchain/core/messages";
// Input
const inputs = new HumanMessage("Use the search tool to ask the user where they are, then look up the weather there");
// Thread
const config2 = { configurable: { thread_id: "3" }, streamMode: "values" as const };
for await (const event of await messagesApp.stream({
messages: [inputs]
}, config2)) {
const recentMsg = event.messages[event.messages.length - 1];
console.log(`================================ ${recentMsg._getType()} Message (1) =================================`)
console.log(recentMsg.content);
}
console.log("next: ", (await messagesApp.getState(config2)).next)
================================ human Message (1) =================================
Use the search tool to ask the user where they are, then look up the weather there
--- ASKING HUMAN ---
================================ ai Message (1) =================================
[
{
type: 'text',
text: "Certainly! I'll use the askHuman tool to ask the user about their location, and then I'll use the search tool to look up the weather for that location. Let's start by asking the user where they are."
},
{
type: 'tool_use',
id: 'toolu_01VG19hoPDUqsHuAk6Jp2L8J',
name: 'askHuman',
input: { input: 'Where are you currently located?' }
}
]
next: [ 'askHuman' ]
The graph is now interrupted inside the askHuman node, which is waiting for a location to be provided. We can provide this value by invoking the graph with a new Command({ resume: "<location>" }) input:
import { Command } from "@langchain/langgraph";
// Continue the graph execution
for await (const event of await messagesApp.stream(
new Command({ resume: "San Francisco" }),
config2,
)) {
console.log(event);
console.log("\n====\n");
}
{
messages: [
HumanMessage {
"id": "ae6dcdd1-1671-46fd-9a58-ae7a6720a4fe",
"content": "Use the search tool to ask the user where they are, then look up the weather there",
"additional_kwargs": {},
"response_metadata": {}
},
AIMessage {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"content": [
{
"type": "text",
"text": "Certainly! I'll use the askHuman tool to ask the user about their location, and then I'll use the search tool to look up the weather for that location. Let's start by asking the user where they are."
},
{
"type": "tool_use",
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"name": "askHuman",
"input": {
"input": "Where are you currently located?"
}
}
],
"additional_kwargs": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"type": "message",
"role": "assistant",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
}
},
"response_metadata": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
},
"type": "message",
"role": "assistant"
},
"tool_calls": [
{
"name": "askHuman",
"args": {
"input": "Where are you currently located?"
},
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"type": "tool_call"
}
],
"invalid_tool_calls": []
}
]
}
====
{
messages: [
HumanMessage {
"id": "ae6dcdd1-1671-46fd-9a58-ae7a6720a4fe",
"content": "Use the search tool to ask the user where they are, then look up the weather there",
"additional_kwargs": {},
"response_metadata": {}
},
AIMessage {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"content": [
{
"type": "text",
"text": "Certainly! I'll use the askHuman tool to ask the user about their location, and then I'll use the search tool to look up the weather for that location. Let's start by asking the user where they are."
},
{
"type": "tool_use",
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"name": "askHuman",
"input": {
"input": "Where are you currently located?"
}
}
],
"additional_kwargs": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"type": "message",
"role": "assistant",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
}
},
"response_metadata": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
},
"type": "message",
"role": "assistant"
},
"tool_calls": [
{
"name": "askHuman",
"args": {
"input": "Where are you currently located?"
},
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"type": "tool_call"
}
],
"invalid_tool_calls": []
},
ToolMessage {
"id": "2c38a43c-7aef-4765-9ddb-5ff202a817f9",
"content": "San Francisco",
"additional_kwargs": {},
"response_metadata": {},
"tool_call_id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J"
}
]
}
====
{
messages: [
HumanMessage {
"id": "ae6dcdd1-1671-46fd-9a58-ae7a6720a4fe",
"content": "Use the search tool to ask the user where they are, then look up the weather there",
"additional_kwargs": {},
"response_metadata": {}
},
AIMessage {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"content": [
{
"type": "text",
"text": "Certainly! I'll use the askHuman tool to ask the user about their location, and then I'll use the search tool to look up the weather for that location. Let's start by asking the user where they are."
},
{
"type": "tool_use",
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"name": "askHuman",
"input": {
"input": "Where are you currently located?"
}
}
],
"additional_kwargs": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"type": "message",
"role": "assistant",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
}
},
"response_metadata": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
},
"type": "message",
"role": "assistant"
},
"tool_calls": [
{
"name": "askHuman",
"args": {
"input": "Where are you currently located?"
},
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"type": "tool_call"
}
],
"invalid_tool_calls": []
},
ToolMessage {
"id": "2c38a43c-7aef-4765-9ddb-5ff202a817f9",
"content": "San Francisco",
"additional_kwargs": {},
"response_metadata": {},
"tool_call_id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J"
},
AIMessage {
"id": "msg_01R9CcAiHwyvcPLadSnjVe4R",
"content": [
{
"type": "text",
"text": "Thank you for letting me know that you're in San Francisco. Now, I'll use the search tool to look up the weather in San Francisco for you."
},
{
"type": "tool_use",
"id": "toolu_018ZKbrzSuYYDgcUTFH6vj9v",
"name": "search",
"input": {
"input": "current weather in San Francisco"
}
}
],
"additional_kwargs": {
"id": "msg_01R9CcAiHwyvcPLadSnjVe4R",
"type": "message",
"role": "assistant",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 584,
"output_tokens": 89
}
},
"response_metadata": {
"id": "msg_01R9CcAiHwyvcPLadSnjVe4R",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 584,
"output_tokens": 89
},
"type": "message",
"role": "assistant"
},
"tool_calls": [
{
"name": "search",
"args": {
"input": "current weather in San Francisco"
},
"id": "toolu_018ZKbrzSuYYDgcUTFH6vj9v",
"type": "tool_call"
}
],
"invalid_tool_calls": [],
"usage_metadata": {
"input_tokens": 584,
"output_tokens": 89,
"total_tokens": 673
}
}
]
}
====
{
messages: [
HumanMessage {
"id": "ae6dcdd1-1671-46fd-9a58-ae7a6720a4fe",
"content": "Use the search tool to ask the user where they are, then look up the weather there",
"additional_kwargs": {},
"response_metadata": {}
},
AIMessage {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"content": [
{
"type": "text",
"text": "Certainly! I'll use the askHuman tool to ask the user about their location, and then I'll use the search tool to look up the weather for that location. Let's start by asking the user where they are."
},
{
"type": "tool_use",
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"name": "askHuman",
"input": {
"input": "Where are you currently located?"
}
}
],
"additional_kwargs": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"type": "message",
"role": "assistant",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
}
},
"response_metadata": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
},
"type": "message",
"role": "assistant"
},
"tool_calls": [
{
"name": "askHuman",
"args": {
"input": "Where are you currently located?"
},
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"type": "tool_call"
}
],
"invalid_tool_calls": []
},
ToolMessage {
"id": "2c38a43c-7aef-4765-9ddb-5ff202a817f9",
"content": "San Francisco",
"additional_kwargs": {},
"response_metadata": {},
"tool_call_id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J"
},
AIMessage {
"id": "msg_01R9CcAiHwyvcPLadSnjVe4R",
"content": [
{
"type": "text",
"text": "Thank you for letting me know that you're in San Francisco. Now, I'll use the search tool to look up the weather in San Francisco for you."
},
{
"type": "tool_use",
"id": "toolu_018ZKbrzSuYYDgcUTFH6vj9v",
"name": "search",
"input": {
"input": "current weather in San Francisco"
}
}
],
"additional_kwargs": {
"id": "msg_01R9CcAiHwyvcPLadSnjVe4R",
"type": "message",
"role": "assistant",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 584,
"output_tokens": 89
}
},
"response_metadata": {
"id": "msg_01R9CcAiHwyvcPLadSnjVe4R",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 584,
"output_tokens": 89
},
"type": "message",
"role": "assistant"
},
"tool_calls": [
{
"name": "search",
"args": {
"input": "current weather in San Francisco"
},
"id": "toolu_018ZKbrzSuYYDgcUTFH6vj9v",
"type": "tool_call"
}
],
"invalid_tool_calls": [],
"usage_metadata": {
"input_tokens": 584,
"output_tokens": 89,
"total_tokens": 673
}
},
ToolMessage {
"id": "149f5ca7-db60-425c-a903-2b163ca8480e",
"content": "It's sunny in San Francisco, but you better look out if you're a Gemini 😈.",
"name": "search",
"additional_kwargs": {},
"response_metadata": {},
"tool_call_id": "toolu_018ZKbrzSuYYDgcUTFH6vj9v"
}
]
}
====
{
messages: [
HumanMessage {
"id": "ae6dcdd1-1671-46fd-9a58-ae7a6720a4fe",
"content": "Use the search tool to ask the user where they are, then look up the weather there",
"additional_kwargs": {},
"response_metadata": {}
},
AIMessage {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"content": [
{
"type": "text",
"text": "Certainly! I'll use the askHuman tool to ask the user about their location, and then I'll use the search tool to look up the weather for that location. Let's start by asking the user where they are."
},
{
"type": "tool_use",
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"name": "askHuman",
"input": {
"input": "Where are you currently located?"
}
}
],
"additional_kwargs": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"type": "message",
"role": "assistant",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
}
},
"response_metadata": {
"id": "msg_01BhcZosXt3R48jjrLJzuFmq",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 465,
"output_tokens": 106
},
"type": "message",
"role": "assistant"
},
"tool_calls": [
{
"name": "askHuman",
"args": {
"input": "Where are you currently located?"
},
"id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J",
"type": "tool_call"
}
],
"invalid_tool_calls": []
},
ToolMessage {
"id": "2c38a43c-7aef-4765-9ddb-5ff202a817f9",
"content": "San Francisco",
"additional_kwargs": {},
"response_metadata": {},
"tool_call_id": "toolu_01VG19hoPDUqsHuAk6Jp2L8J"
},
AIMessage {
"id": "msg_01R9CcAiHwyvcPLadSnjVe4R",
"content": [
{
"type": "text",
"text": "Thank you for letting me know that you're in San Francisco. Now, I'll use the search tool to look up the weather in San Francisco for you."
},
{
"type": "tool_use",
"id": "toolu_018ZKbrzSuYYDgcUTFH6vj9v",
"name": "search",
"input": {
"input": "current weather in San Francisco"
}
}
],
"additional_kwargs": {
"id": "msg_01R9CcAiHwyvcPLadSnjVe4R",
"type": "message",
"role": "assistant",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 584,
"output_tokens": 89
}
},
"response_metadata": {
"id": "msg_01R9CcAiHwyvcPLadSnjVe4R",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 584,
"output_tokens": 89
},
"type": "message",
"role": "assistant"
},
"tool_calls": [
{
"name": "search",
"args": {
"input": "current weather in San Francisco"
},
"id": "toolu_018ZKbrzSuYYDgcUTFH6vj9v",
"type": "tool_call"
}
],
"invalid_tool_calls": [],
"usage_metadata": {
"input_tokens": 584,
"output_tokens": 89,
"total_tokens": 673
}
},
ToolMessage {
"id": "149f5ca7-db60-425c-a903-2b163ca8480e",
"content": "It's sunny in San Francisco, but you better look out if you're a Gemini 😈.",
"name": "search",
"additional_kwargs": {},
"response_metadata": {},
"tool_call_id": "toolu_018ZKbrzSuYYDgcUTFH6vj9v"
},
AIMessage {
"id": "msg_014wyeLFQQ23iwufeawSpCG6",
"content": "Based on the search results, it appears that the current weather in San Francisco is sunny. However, the search result also includes an unrelated comment about zodiac signs, which we can disregard for the purpose of providing weather information.\n\nTo summarize: The weather in San Francisco is currently sunny. It's a great day to be outdoors or enjoy the city's attractions. Remember to wear sunscreen and stay hydrated if you plan to spend time outside.\n\nIs there anything else you'd like to know about the weather in San Francisco or any other information you need?",
"additional_kwargs": {
"id": "msg_014wyeLFQQ23iwufeawSpCG6",
"type": "message",
"role": "assistant",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "end_turn",
"stop_sequence": null,
"usage": {
"input_tokens": 706,
"output_tokens": 122
}
},
"response_metadata": {
"id": "msg_014wyeLFQQ23iwufeawSpCG6",
"model": "claude-3-5-sonnet-20240620",
"stop_reason": "end_turn",
"stop_sequence": null,
"usage": {
"input_tokens": 706,
"output_tokens": 122
},
"type": "message",
"role": "assistant"
},
"tool_calls": [],
"invalid_tool_calls": [],
"usage_metadata": {
"input_tokens": 706,
"output_tokens": 122,
"total_tokens": 828
}
}
]
}
====
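As with the simpler graph, we can confirm the thread has run to completion by checking its state again; next will now be empty:

console.log("next: ", (await messagesApp.getState(config2)).next)

next:  []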