Models¶
This page describes how to configure the chat model used by an agent.
Tool calling support¶
To enable tool-calling agents, the underlying LLM must support tool calling.
Compatible models can be found in the LangChain integrations directory.
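As a quick check, a chat model that supports tool calling exposes a bindTools() method. The sketch below binds a hypothetical get_weather tool to an Anthropic model; the tool's name, schema, and behavior are purely illustrative, not part of any library:

```typescript
import { ChatAnthropic } from "@langchain/anthropic";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Hypothetical tool, defined only for illustration
const getWeather = tool(
  async ({ city }) => `It is sunny in ${city}.`,
  {
    name: "get_weather",
    description: "Get the current weather for a city",
    schema: z.object({ city: z.string() }),
  }
);

// Tool-calling-capable chat models expose bindTools()
const llm = new ChatAnthropic({ model: "claude-3-7-sonnet-latest" });
const llmWithTools = llm.bindTools([getWeather]);
```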
Using initChatModel¶
The initChatModel utility simplifies model initialization with configurable parameters:
import { initChatModel } from "langchain/chat_models/universal";

// A "provider:model" string selects both the provider and the model
const llm = await initChatModel(
  "anthropic:claude-3-7-sonnet-latest",
  {
    temperature: 0,
    maxTokens: 2048
  }
);
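Once initialized, the model can be invoked like any other chat model:

```typescript
// Invoke the model directly with a plain string prompt
const response = await llm.invoke("Why do parrots talk?");
console.log(response.content);
```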
Refer to the API reference for advanced options.
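One such option is deferring the model choice to runtime. The following is a rough sketch of that pattern; treat the exact option shape as an assumption to verify against the API reference:

```typescript
import { initChatModel } from "langchain/chat_models/universal";

// No model specified up front; each call selects one via the
// "configurable" field (assumed default configurable key: "model")
const configurableLlm = await initChatModel(undefined, { temperature: 0 });

const res = await configurableLlm.invoke("What's your name?", {
  configurable: { model: "anthropic:claude-3-7-sonnet-latest" },
});
```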
Using provider-specific LLMs¶
If a model provider is not available via initChatModel, you can instantiate the provider's model class directly. The model must implement the BaseChatModel interface and support tool calling:
import { ChatAnthropic } from "@langchain/anthropic";
import { createReactAgent } from "@langchain/langgraph/prebuilt";

const llm = new ChatAnthropic({
  model: "claude-3-7-sonnet-latest",
  temperature: 0,
  maxTokens: 2048
});

const agent = createReactAgent({
  llm,
  // other parameters
});
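Assuming the agent above was also created with a tools array, it can then be invoked with a list of messages. A minimal sketch:

```typescript
// The prebuilt agent takes (and returns) graph state with a
// "messages" array; the agent's final reply is the last message
const result = await agent.invoke({
  messages: [{ role: "user", content: "What's the weather in Paris?" }],
});
console.log(result.messages.at(-1)?.content);
```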
Illustrative example

The example above uses ChatAnthropic, which is already supported by initChatModel. This pattern is shown to illustrate how to manually instantiate a model not available through initChatModel.