
Date: 2025-09-28 09:51:57

Here are three runs of your code with the same model (gemini-2.5-flash) and different prompts:

1st run: your prompt (What's my name?)

================================ Human Message =================================

Hi! I am Bob!
================================== Ai Message ==================================

Hello Bob! How can I help you today?
================================ Human Message =================================

What's my name?
================================== Ai Message ==================================

I'm sorry, I don't have memory of past conversations. Could you please tell me your name again?

2nd run: prompt (Do you know my name?)

================================ Human Message =================================

Hi! I am Bob!
================================== Ai Message ==================================

Hello Bob! How can I help you today?
================================ Human Message =================================

Do you know my name?
================================== Ai Message ==================================

Yes, your name is Bob.

3rd run: prompt (Do you remember my name?)

================================ Human Message =================================

Hi! I am Bob!
================================== Ai Message ==================================

Hello Bob! How can I help you today?
================================ Human Message =================================

Do you remember my name?
================================== Ai Message ==================================

Yes, I do, Bob!

As you can see, the agent does have the chat history/memory.

So why does “What’s my name?” fail while “Do you know my name?” / “Do you remember my name?” work?

  1. Gemini (like most LLMs) has no “structured” memory unless we feed the history back to it in each request.

  2. When you ask “What’s my name?”, the model interprets it literally as a knowledge recall task. Since it doesn’t have an internal persistent memory store, it defaults to “I don’t know your name.”

  3. When you ask “Do you know my name?” or “Do you remember my name?”, the model interprets this more conversationally and looks at the immediate chat history in the same request, so it correctly extracts “Bob”.
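The mechanics above can be sketched without any real LLM (a toy illustration of checkpointer-style memory, not LangGraph's actual internals): the checkpointer is just per-thread storage that replays the entire message list on every turn, so “Bob” is always present in the request; what varies is how the model chooses to read it.

```python
import re

# Per-thread message store, analogous to MemorySaver keyed by thread_id.
store: dict[str, list[dict]] = {}

def invoke(thread_id: str, user_msg: str) -> str:
    history = store.setdefault(thread_id, [])
    history.append({"role": "user", "content": user_msg})
    # The "model" receives the FULL replayed history on every turn,
    # so "Hi! I am Bob!" is in context even on the second request.
    name = None
    for msg in history:
        m = re.search(r"I am (\w+)", msg["content"])
        if m:
            name = m.group(1)
    reply = f"Your name is {name}." if name else "I don't know your name."
    history.append({"role": "assistant", "content": reply})
    return reply

invoke("agent003", "Hi! I am Bob!")
print(invoke("agent003", "What's my name?"))  # answerable only via replayed history
```

The name is recoverable on the second turn purely because the first turn was replayed; with a fresh thread_id it is not. A real LLM gets exactly the same replayed context and then decides, based on the question's phrasing, whether to use it.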

So, this is not LangGraph memory failing; it's model behavior specific to Gemini.

The example shown in the official documentation (https://python.langchain.com/docs/tutorials/agents/) uses anthropic:claude-3-5-sonnet-latest, which behaves differently from Gemini models.
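If you want “What's my name?” to work reliably with Gemini, a common workaround is to add a system instruction telling the model to answer such questions from the conversation history (recent versions of langgraph's create_react_agent accept a prompt argument for this; check your version). The message assembly this amounts to can be sketched in plain Python; the SYSTEM text and build_request helper below are my own illustration, not a LangChain API:

```python
# Sketch: prepend a system instruction so literal recall questions
# ("What's my name?") are answered from the replayed history.
SYSTEM = ("You have access to the full conversation history above. "
          "Answer questions about the user (e.g. their name) from it.")

def build_request(history: list[dict], user_msg: str) -> list[dict]:
    """Build the message list sent to the model on each turn."""
    return ([{"role": "system", "content": SYSTEM}]
            + history
            + [{"role": "user", "content": user_msg}])

history = [{"role": "user", "content": "Hi! I am Bob!"},
           {"role": "assistant", "content": "Hello Bob!"}]
request = build_request(history, "What's my name?")
```

With the instruction in place, the model is nudged to treat recall questions as history lookups rather than knowledge-base queries, regardless of phrasing.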

Here's another example with the exact same code but a different model, llama3.2:latest, from Ollama.

import os
from langchain_tavily import TavilySearch
from langgraph.checkpoint.memory import MemorySaver
from langchain_core.messages import HumanMessage
from langgraph.prebuilt import create_react_agent
from langchain_ollama import ChatOllama
from dotenv import load_dotenv

load_dotenv()
assert os.environ.get('TAVILY_API_KEY'), "TAVILY_API_KEY is not set"
search = TavilySearch(max_results=2)
tools = [search]

model = ChatOllama(
    model="llama3.2:latest", temperature=0)

memory = MemorySaver()
agent_executor = create_react_agent(model, tools, checkpointer=memory)

# Same thread_id for continuity
config = {"configurable": {"thread_id": "agent003"}}

# First turn
for step in agent_executor.stream(
    {"messages": [HumanMessage("Hi! I am Bob!")]}, config, stream_mode="values"
):
    step["messages"][-1].pretty_print()

# Second turn – no need to fetch history yourself
for step in agent_executor.stream(
    {"messages": [HumanMessage("what's my name?")]}, config, stream_mode="values"
):
    step["messages"][-1].pretty_print()

output:

================================ Human Message =================================
Hi! I am Bob!
================================== Ai Message ==================================
Tool Calls:
....
================================= Tool Message =================================
Name: tavily_search
....
================================== Ai Message ==================================
Your name is Bob! I've found multiple individuals with the name Bob, including Bob Marley, B.o.B, and Bob Iger. Is there a specific Bob you're interested in learning more about?
Posted by: Ajeet Verma