
Models

This page describes how to configure the chat model used by an agent.

Tool calling support

To enable tool-calling agents, the underlying LLM must support tool calling.

Compatible models can be found in the LangChain integrations directory.
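For example, a tool-calling agent pairs a compatible model with one or more tools. A minimal sketch, where get_weather is a hypothetical tool defined only for illustration:

from langchain_core.tools import tool
from langgraph.prebuilt import create_react_agent

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    # Hypothetical stub; a real tool would call a weather API.
    return f"It's always sunny in {city}!"

agent = create_react_agent(
    model="anthropic:claude-3-7-sonnet-latest",
    tools=[get_weather],
)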

Specifying a model by name

You can configure an agent with a model name string:

API Reference: create_react_agent

from langgraph.prebuilt import create_react_agent

agent = create_react_agent(
    model="anthropic:claude-3-7-sonnet-latest",
    # other parameters
)
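The returned agent is a LangGraph runnable. As a sketch (assuming a tools list was passed, as shown earlier), you can invoke it with a messages dict:

result = agent.invoke(
    {"messages": [{"role": "user", "content": "What is the weather in SF?"}]}
)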

Using init_chat_model

The init_chat_model utility simplifies model initialization with configurable parameters:

API Reference: init_chat_model

from langchain.chat_models import init_chat_model

model = init_chat_model(
    "anthropic:claude-3-7-sonnet-latest",
    temperature=0,
    max_tokens=2048
)
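init_chat_model can also defer the model choice to runtime via configurable fields. A sketch of that pattern (the exact configurable keys depend on your setup):

from langchain.chat_models import init_chat_model

# No default model: the model is selected at call time via config.
configurable_model = init_chat_model(temperature=0)

response = configurable_model.invoke(
    "Hello!",
    config={"configurable": {"model": "anthropic:claude-3-7-sonnet-latest"}},
)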

Refer to the API reference for advanced options.

Using provider-specific LLMs

If a model provider is not available via init_chat_model, you can instantiate the provider's model class directly. The model must implement the BaseChatModel interface and support tool calling:

API Reference: ChatAnthropic | create_react_agent

from langchain_anthropic import ChatAnthropic
from langgraph.prebuilt import create_react_agent

model = ChatAnthropic(
    model="claude-3-7-sonnet-latest",
    temperature=0,
    max_tokens=2048
)

agent = create_react_agent(
    model=model,
    # other parameters
)
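Before handing a custom model to an agent, you can sanity-check tool-calling support by binding a tool directly; BaseChatModel.bind_tools raises NotImplementedError for models that do not implement it. A minimal sketch, reusing a hypothetical get_weather tool:

from langchain_core.tools import tool

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    # Hypothetical stub for illustration only.
    return f"It's always sunny in {city}!"

# Fails for chat models that do not support tool calling.
model_with_tools = model.bind_tools([get_weather])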

Illustrative example

The example above uses ChatAnthropic, which is already supported by init_chat_model; the pattern is shown only to illustrate how you would instantiate a model that is not available through init_chat_model.
