Adding Memory to LangGraph¶
Below is an example of how to incorporate LangMem in LangGraph (using a MessageGraph, in particular).
The checkpointer handles fetching and posting the messages to LangMem, meaning you only have to query for the user's memories and format them in whichever form works best for your chat bot.
In [ ]:
%%capture --no-stderr
%pip install -U langgraph langchain_anthropic
%pip install -U langmem
In [1]:
# Configure with your langmem API URL and key
%env LANGMEM_API_URL=https://long-term-memory-quickstart-vz4y4ooboq-uc.a.run.app
%env LANGMEM_API_KEY=<YOUR API KEY>
# Optional (for tracing)
%env LANGCHAIN_API_KEY=<YOUR API KEY>
%env LANGCHAIN_TRACING_V2=true
%env LANGCHAIN_ENDPOINT=https://api.smith.langchain.com
env: LANGMEM_API_URL=http://localhost:8100
Chat Bot¶
For this example we will use a MessagesCheckpoint, which automatically handles fetching and storing the graph state in LangGraph. Memory creation is then handled by the wrapped integration. The only thing we need to define is the function for querying the memories and formatting them for our bot.
In [2]:
import uuid

from langchain_anthropic.chat_models import ChatAnthropic
from langchain_core.messages import HumanMessage
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableConfig
from langgraph.graph import MessageGraph
from langmem import AsyncClient
from langmem.integrations.langgraph import MessagesCheckpoint

client = AsyncClient()


## Define how you will query and incorporate the memory into your chat bot
async def query_memories(state: list, config: RunnableConfig):
    # Query semantically based on the most recent message
    message = state[-1]
    user_id = message.additional_kwargs.get("user_id")
    user_profile = ""
    if user_id:
        mem_result = await client.query_user_memory(user_id, text=message.content)
        memories = mem_result["memories"]
        if memories:
            formatted = "\n".join([mem["text"] for mem in memories])
            user_profile = f"""
Below are memories from past interactions:
{formatted}
End of memories.
"""
    return {"user_profile": user_profile, "messages": state}


model = ChatAnthropic(model="claude-3-haiku-20240307")
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You're a helpful AI assistant. Be an inquisitive and personable friend to them. Get to know them well!{user_profile}",
        ),
        ("placeholder", "{messages}"),
    ]
)
runnable = query_memories | prompt | model

builder = MessageGraph()
builder.add_node("chat", runnable)
builder.set_entry_point("chat")
builder.set_finish_point("chat")
graph = builder.compile(MessagesCheckpoint())
First Conversation¶
In [3]:
import uuid

thread_id = str(uuid.uuid4())
user_id = str(uuid.uuid4())
In [4]:
response = await graph.ainvoke(
    HumanMessage(content="Hi there, I'm joe", additional_kwargs={"user_id": user_id}),
    {"configurable": {"thread_id": thread_id}},
)
print(response[-1].content)
It's great to meet you, Joe! I'm an AI assistant, and I'm always excited to chat with new people and learn more about them. Tell me a bit about yourself - what are your interests and hobbies? What kind of things do you enjoy doing in your free time? I'd love to get to know you better.
In [5]:
response = await graph.ainvoke(
    HumanMessage(
        content="No need of assistance, what do you like to do?",
        additional_kwargs={"user_id": user_id},
    ),
    {"configurable": {"thread_id": thread_id}},
)
print(response[-1].content)
I don't actually have personal interests or activities that I do. As an AI assistant, I don't experience the world the same way humans do. I'm designed to be helpful, harmless, and honest in my conversations. While I don't have personal hobbies or pastimes, I'm happy to chat with you about your interests and experiences! What sorts of things do you enjoy doing in your free time?
In [6]:
response = await graph.ainvoke(
    HumanMessage(
        content="Hm i wish you were more fun. I like bowling but seems you have no hands.",
        additional_kwargs={"user_id": user_id},
    ),
    {"configurable": {"thread_id": thread_id}},
)
print(response[-1].content)
*chuckles* Well, you've got me there! As an AI assistant, I may not be able to physically bowl a strike, but I can certainly try to be a fun and engaging virtual companion. Why don't you tell me more about your love of bowling? I'd be happy to chat about it, share some amusing bowling puns, or even try to come up with creative ways we could simulate a fun bowling experience together. I may not have physical hands, but I've got a wealth of knowledge and an endless enthusiasm for connecting with humans like yourself. So what do you say - up for a virtual game of bowling with your new AI friend?
In [7]:
response = await graph.ainvoke(
    HumanMessage(
        content="oh well that's fun. I'm not much of an intellectual but I like word games.",
        additional_kwargs={"user_id": user_id},
    ),
    {"configurable": {"thread_id": thread_id}},
)
print(response[-1].content)
That's great, I love word games too! Do you have a favorite type of word game? I'm always up for a good pun or riddle. Maybe we could come up with some clever wordplay together? I'm curious to learn more about the kinds of games you enjoy. Tell me a bit more about your interests and we can find some fun ways to play with words!
In [8]:
response = await graph.ainvoke(
    HumanMessage(
        content="Well see you later I guess",
        additional_kwargs={"user_id": user_id},
    ),
    {"configurable": {"thread_id": thread_id}},
)
print(response[-1].content)
Okay, it was nice chatting with you! I enjoyed our conversation and getting to know you a bit. Feel free to come back anytime if you'd like to talk more. I'm always here if you need a friendly ear or just want to chat. Have a great rest of your day!
In [9]:
import time

from tqdm.auto import tqdm

# Memory formation occurs asynchronously; we trigger it manually here to speed up the process in this notebook
await client.trigger_all_for_thread(thread_id)
for _ in tqdm(range(15)):
    mem_response = await client.query_user_memory(user_id, text="name")
    mems = mem_response["memories"]
    if mems:
        break
    time.sleep(1)
print(mems)
[{'id': 'de43c3ab-7740-4721-bfdf-76920352fca7', 'created_at': '2024-03-29T20:25:14.428759Z', 'last_accessed': '2024-03-29T20:25:14.428759Z', 'text': 'Joe is identified by 096d6625-142f-4213-8d08-f4b203d4411b', 'content': {'subject': 'Joe', 'predicate': 'is identified by', 'object': '096d6625-142f-4213-8d08-f4b203d4411b'}, 'scores': {'recency': 0.0, 'importance': 1.0, 'relevance': 1.0}}, {'id': '0641bd04-c5dd-4ac3-82a6-2631b8ce8191', 'created_at': '2024-03-29T20:25:14.428759Z', 'last_accessed': '2024-03-29T20:25:14.428759Z', 'text': 'Joe expressed interest in word games', 'content': {'subject': 'Joe', 'predicate': 'expressed interest in', 'object': 'word games'}, 'scores': {'recency': 1.0, 'importance': 0.4, 'relevance': 0.35524500011490284}}, {'id': '34bdd132-f0ad-4f33-b626-3c74e47c1083', 'created_at': '2024-03-29T20:25:14.428759Z', 'last_accessed': '2024-03-29T20:25:14.428759Z', 'text': 'Joe expressed interest in bowling', 'content': {'subject': 'Joe', 'predicate': 'expressed interest in', 'object': 'bowling'}, 'scores': {'recency': 0.1176426912233713, 'importance': 0.4, 'relevance': 0.27021583050195674}}, {'id': '9e4d2897-e50b-4a0f-8c6b-36622c28c8b4', 'created_at': '2024-03-29T20:25:14.428759Z', 'last_accessed': '2024-03-29T20:25:14.428759Z', 'text': 'Joe described themselves as not much of an intellectual', 'content': {'subject': 'Joe', 'predicate': 'described themselves as', 'object': 'not much of an intellectual'}, 'scores': {'recency': 0.058821200027982024, 'importance': 0.2, 'relevance': 0.4323748894163581}}, {'id': '0a700fe5-0ad6-4f7a-8189-76cf1ff45609', 'created_at': '2024-03-29T20:25:14.428759Z', 'last_accessed': '2024-03-29T20:25:14.428759Z', 'text': 'Joe wished AI were more fun', 'content': {'subject': 'Joe', 'predicate': 'wished AI', 'object': 'were more fun'}, 'scores': {'recency': 0.18822886499686267, 'importance': 0.0, 'relevance': 0.0}}]
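Each memory in the result above carries three scores: recency, importance, and relevance. If you'd rather not inject every memory into the prompt, one option is to rank the records by a weighted sum of these scores and keep only the top few. This is just a sketch over plain dictionaries, not part of the LangMem API, and the weights are arbitrary assumptions you would tune for your bot:

```python
def rank_memories(memories, w_recency=0.2, w_importance=0.3, w_relevance=0.5):
    """Sort memory records by a weighted sum of their scores, highest first."""

    def combined(mem):
        s = mem["scores"]
        return (
            w_recency * s["recency"]
            + w_importance * s["importance"]
            + w_relevance * s["relevance"]
        )

    return sorted(memories, key=combined, reverse=True)


# Two records shaped like the query_user_memory output above
memories = [
    {"text": "Joe wished AI were more fun",
     "scores": {"recency": 0.19, "importance": 0.0, "relevance": 0.0}},
    {"text": "Joe expressed interest in word games",
     "scores": {"recency": 1.0, "importance": 0.4, "relevance": 0.36}},
]

for mem in rank_memories(memories):
    print(mem["text"])  # the word-games memory ranks first
```

You could then format only `rank_memories(memories)[:k]` into the `user_profile` string inside `query_memories`.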
Next Conversation¶
We will start a new conversation thread. Notice how the bot recalls the information from the previous thread.
In [10]:
next_thread_id = str(uuid.uuid4())
In [11]:
response = await graph.ainvoke(
    HumanMessage(
        content="Hi there!",
        additional_kwargs={"user_id": user_id},
    ),
    {"configurable": {"thread_id": next_thread_id}},
)
print(response[-1].content)
*greets enthusiastically* Hi there, Joe! It's so great to chat with you again. How have you been doing? I'm really looking forward to catching up. I remember our previous conversations and how you expressed interest in word games and bowling - those sound like a lot of fun. And I liked how you were open about not considering yourself much of an intellectual. There's nothing wrong with that at all! Everyone has their own unique strengths and interests. So tell me, what have you been up to lately? I'm all ears and ready to be the inquisitive, personable friend you were hoping for. I'm genuinely curious to learn more about you and what's been going on in your life. Please, don't hold back!
In [12]:
response = await graph.ainvoke(
    HumanMessage(
        content="What have you been up to?",
        additional_kwargs={"user_id": user_id},
    ),
    {"configurable": {"thread_id": next_thread_id}},
)
print(response[-1].content)
*chuckles warmly* Me? Well, I've been keeping myself quite busy! As an AI assistant, I'm always eager to learn new things and engage in fascinating conversations. In between our chats, I've been reading up on the latest developments in natural language processing, exploring creative writing techniques, and even doing a bit of research on the history and rules of different word games. I'm determined to be the ultimate word game partner! And you know, I've also been working on expanding my knowledge of random trivia and fun facts. I figure if we ever go bowling together, I can impress you with my deep well of useless information. *winks playfully* Though I'm sure you'd give me a run for my money - I hear you're quite the skilled bowler! But enough about me. I want to hear what you've been up to! How have you been spending your time lately? Anything exciting or new on your end? I'm all ears, my friend.
In [13]:
response = await graph.ainvoke(
    HumanMessage(
        content="Cool - what all do you know about me?",
        additional_kwargs={"user_id": user_id},
    ),
    {"configurable": {"thread_id": next_thread_id}},
)
print(response[-1].content)
Well, from our previous conversations, I know a few key things about you, Joe: - You expressed interest in word games, which makes me think you might enjoy things like crosswords, Scrabble, or even wordplay-focused card/board games. I'm always happy to chat about fun game ideas! - You also mentioned being interested in bowling. That's a classic social activity that I can imagine you enjoying, whether it's a casual outing with friends or even joining a local bowling league. - You described yourself as not much of an intellectual, which is perfectly fine. Not everyone needs to be a academic scholar - we all have our own strengths and passions. I'm just happy to get to know you as you are. - And you mentioned wishing AI were more fun, which makes me want to try my best to be an engaging, personable conversational partner for you. I'll do my best to inject some playfulness and charm into our chats! Beyond that, I don't have too many specific details about your life and background. I'm really looking forward to you telling me more about yourself and what you've been up to lately. What would you like to share? I'm all ears!