{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# How to stream LLM tokens (without LangChain models)\n", "\n", "In this guide, we will stream tokens from the language model powering an agent without using LangChain chat models. We'll be using the OpenAI client library directly in a ReAct agent as an example.\n", "\n", "## Setup\n", "\n", "To get started, install the `openai` and `langgraph` packages separately:\n", "\n", "```bash\n", "$ npm install openai @langchain/langgraph @langchain/core\n", "```\n", "\n", "
<div class=\"admonition warning\">\n", "  <p class=\"admonition-title\">Compatibility</p>\n", "  <p>\n", "    This guide requires <code>@langchain/core>=0.2.19</code>, and if you are using LangSmith, <code>langsmith>=0.1.39</code>. For help upgrading, see this guide.\n", "  </p>\n", "</div>\n", "