{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Use in web environments\n", "\n", "LangGraph.js uses the [`async_hooks`](https://nodejs.org/api/async_hooks.html) API to more conveniently allow\n", "for tracing and callback propagation within nodes. This API is supported in many environments, such as\n", "[Node.js](https://nodejs.org/api/async_hooks.html),\n", "[Deno](https://deno.land/std@0.177.0/node/internal/async_hooks.ts),\n", "[Cloudflare Workers](https://developers.cloudflare.com/workers/runtime-apis/nodejs/asynclocalstorage/),\n", "and the [Edge runtime](https://vercel.com/docs/functions/runtimes/edge-runtime#compatible-node.js-modules), but not all,\n", "such as within web browsers.\n", "\n", "To allow usage of LangGraph.js in environments that do not have the `async_hooks` API available, we've added a separate\n", "`@langchain/langgraph/web` entrypoint. This entrypoint exports everything that the primary `@langchain/langgraph` exports,\n", "but will not initialize or even import `async_hooks`. Here's a simple example:" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Hello from the browser!\n" ] } ], "source": [ "// Import from \"@langchain/langgraph/web\"\n", "import {\n", " END,\n", " START,\n", " StateGraph,\n", " StateGraphArgs\n", "} from \"@langchain/langgraph/web\";\n", "import { HumanMessage } from \"@langchain/core/messages\";\n", "\n", "// Define the state interface\n", "interface AgentState {\n", " messages: HumanMessage[];\n", "}\n", "\n", "// Define the graph state\n", "const graphState: StateGraphArgs[\"channels\"] = {\n", " messages: {\n", " value: (x: HumanMessage[], y: HumanMessage[]) => x.concat(y),\n", " default: () => [],\n", " },\n", "};\n", "\n", "const nodeFn = async (_state: AgentState) => {\n", " return { messages: [new HumanMessage(\"Hello from the browser!\")] }\n", "}\n", "\n", "// Define a new graph\n", "const workflow = new StateGraph({ channels: graphState })\n", " .addNode(\"node\", nodeFn)\n", " .addEdge(START, \"node\")\n", " .addEdge(\"node\", END);\n", "\n", "const app = workflow.compile({});\n", "\n", "// Use the Runnable\n", "const finalState = await app.invoke(\n", " { messages: [] },\n", ");\n", "\n", "console.log(finalState.messages[finalState.messages.length - 1].content);" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Other entrypoints, such as `@langchain/langgraph/prebuilt`, can be used in either environment.\n", "\n", "
\n", "

Caution

\n", "

\n", " If you are using LangGraph.js on the frontend, make sure you are not exposing any private keys!\n", " For chat models, this means you need to use something like WebLLM\n", " that can run client-side without authentication.\n", "

\n", "
\n", "\n", "## Passing config\n", "\n", "The lack of `async_hooks` support in web browsers means that if you are calling a\n", "[`Runnable`](https://js.langchain.com/v0.2/docs/concepts#interface) within a node (for example, when calling a chat model),\n", "you need to manually pass a `config` object through to properly support tracing,\n", "[`.streamEvents()`](https://js.langchain.com/v0.2/docs/how_to/streaming#using-stream-events) to stream intermediate steps,\n", "and other callback related functionality. This config object will passed in as the second argument of each node, and should be\n", "used as the second parameter of any `Runnable` method.\n", "\n", "To illustrate this, let's set up our graph again as before, but with a `Runnable` within our node. First, we'll\n", "avoid passing `config` through into the nested function, then try to use `.streamEvents()` to see the intermediate results of the\n", "nested function:" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Received 0 events from the nested function\n" ] } ], "source": [ "// Import from \"@langchain/langgraph/web\"\n", "import {\n", " END,\n", " START,\n", " StateGraph,\n", " StateGraphArgs\n", "} from \"@langchain/langgraph/web\";\n", "import { HumanMessage } from \"@langchain/core/messages\";\n", "import {\n", " RunnableLambda,\n", "} from \"@langchain/core/runnables\";\n", "import { type StreamEvent } from \"@langchain/core/tracers/log_stream\";\n", "\n", "// Define the state interface\n", "interface AgentState {\n", " messages: HumanMessage[];\n", "}\n", "\n", "// Define the graph state\n", "const graphState: StateGraphArgs[\"channels\"] = {\n", " messages: {\n", " value: (x: HumanMessage[], y: HumanMessage[]) => x.concat(y),\n", " default: () => [],\n", " },\n", "};\n", "\n", "const nodeFn = async (_state: AgentState) => {\n", " // Note that we do not pass any `config` through here\n", " const nestedFn = RunnableLambda.from(async (input: string) => {\n", " return new HumanMessage(`Hello from ${input}!`)\n", " }).withConfig({ runName: \"nested\" });\n", " const responseMessage = await nestedFn.invoke(\"a nested function\");\n", " return { messages: [responseMessage] }\n", "}\n", "\n", "// Define a new graph\n", "const workflow = new StateGraph({ channels: graphState })\n", " .addNode(\"node\", nodeFn)\n", " .addEdge(START, \"node\")\n", " .addEdge(\"node\", END);\n", "\n", "const app = workflow.compile({});\n", "\n", "// Stream intermediate steps from the graph\n", "const eventStream = await app.streamEvents(\n", " { messages: [] },\n", " { version: \"v2\" },\n", " { includeNames: [\"nested\"] }\n", ");\n", "\n", "const events: StreamEvent[] = [];\n", "for await (const event of eventStream) {\n", " console.log(event);\n", " events.push(event);\n", "}\n", "\n", "console.log(`Received ${events.length} events from the nested function`);" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can see that we get no events.\n", "\n", "Next, let's try redeclaring the graph with a node that passes config through correctly:" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{\n", " event: \"on_chain_start\",\n", " data: { input: { messages: [] } },\n", " name: \"nested\",\n", " tags: [],\n", " run_id: \"9a7c5a55-f9f1-4058-8c58-7be43078468c\",\n", " metadata: {}\n", "}\n", "{\n", " event: \"on_chain_end\",\n", " data: {\n", " output: HumanMessage {\n", 
" lc_serializable: true,\n", " lc_kwargs: {\n", " content: \"Hello from a nested function!\",\n", " additional_kwargs: {},\n", " response_metadata: {}\n", " },\n", " lc_namespace: [ \"langchain_core\", \"messages\" ],\n", " content: \"Hello from a nested function!\",\n", " name: undefined,\n", " additional_kwargs: {},\n", " response_metadata: {}\n", " }\n", " },\n", " run_id: \"9a7c5a55-f9f1-4058-8c58-7be43078468c\",\n", " name: \"nested\",\n", " tags: [],\n", " metadata: {}\n", "}\n", "Received 2 events from the nested function\n" ] } ], "source": [ "// Import from \"@langchain/langgraph/web\"\n", "import {\n", " END,\n", " START,\n", " StateGraph,\n", " StateGraphArgs\n", "} from \"@langchain/langgraph/web\";\n", "import { HumanMessage } from \"@langchain/core/messages\";\n", "import {\n", " RunnableLambda,\n", " type RunnableConfig\n", "} from \"@langchain/core/runnables\";\n", "import { type StreamEvent } from \"@langchain/core/tracers/log_stream\";\n", "\n", "// Define the state interface\n", "interface AgentState {\n", " messages: HumanMessage[];\n", "}\n", "\n", "// Define the graph state\n", "const graphState: StateGraphArgs[\"channels\"] = {\n", " messages: {\n", " value: (x: HumanMessage[], y: HumanMessage[]) => x.concat(y),\n", " default: () => [],\n", " },\n", "};\n", "\n", "// Note the second argument here.\n", "const nodeFn = async (_state: AgentState, config?: RunnableConfig) => {\n", " // If you need to nest deeper, remember to pass `_config` when invoking\n", " const nestedFn = RunnableLambda.from(async (input: string, _config?: RunnableConfig) => {\n", " return new HumanMessage(`Hello from ${input}!`)\n", " }).withConfig({ runName: \"nested\" });\n", " const responseMessage = await nestedFn.invoke(\"a nested function\", config);\n", " return { messages: [responseMessage] }\n", "}\n", "\n", "// Define a new graph\n", "const workflow = new StateGraph({ channels: graphState })\n", " .addNode(\"node\", nodeFn)\n", " .addEdge(START, \"node\")\n", " .addEdge(\"node\", END);\n", "\n", "const app = workflow.compile({});\n", "\n", "// Stream intermediate steps from the graph\n", "const eventStream = await app.streamEvents(\n", " { messages: [] },\n", " { version: \"v2\" },\n", " { includeNames: [\"nested\"] }\n", ");\n", "\n", "const events: StreamEvent[] = [];\n", "for await (const event of eventStream) {\n", " console.log(event);\n", " events.push(event);\n", "}\n", "\n", "console.log(`Received ${events.length} events from the nested function`);" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can see that we get events from the nested function as expected.\n", "\n", "## Next steps\n", "\n", "You've now learned about some special considerations around using LangGraph.js in web environments.\n", "\n", "Next, check out [some how-to guides on core functionality](https://langchain-ai.github.io/langgraphjs/how-tos/#core)." ] } ], "metadata": { "kernelspec": { "display_name": "Deno", "language": "typescript", "name": "deno" }, "language_info": { "file_extension": ".ts", "mimetype": "text/x.typescript", "name": "typescript", "nb_converter": "script", "pygments_lexer": "typescript", "version": "5.3.3" } }, "nbformat": 4, "nbformat_minor": 2 }