AI / LangChain & LangGraph Interview Questions
How do you integrate memory with agents?
By default, AgentExecutor has no memory — each invocation is stateless. To give an agent conversation memory, pass a memory object to AgentExecutor. This is distinct from return_intermediate_steps (which stores tool call history within a single run); memory stores the dialogue across multiple separate invocations.
from langchain.memory import ConversationBufferMemory
from langchain.agents import AgentExecutor

memory = ConversationBufferMemory(
    memory_key="chat_history",  # must match the prompt variable
    return_messages=True,       # return Message objects, not a flat string
)

executor = AgentExecutor(
    agent=agent,
    tools=tools,
    memory=memory,
    verbose=True,
)

executor.invoke({"input": "My name is Alice."})
executor.invoke({"input": "What did I just tell you?"})  # recalls "Alice"
The prompt used by the agent must include a {chat_history} variable (or whatever memory_key is set to) so the history is injected on each call. For multi-user scenarios, each user needs their own memory object — or use LangGraph's checkpointing with thread IDs to manage per-conversation state.
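The per-conversation isolation idea can be sketched without any LangChain dependency. The sketch below is illustrative only: a minimal in-memory store keyed by thread ID, mimicking what LangGraph's checkpointing does with thread_id (the class name HistoryStore and its methods are hypothetical, not a real library API).

```python
# Minimal, dependency-free sketch of per-conversation state keyed by
# thread ID. HistoryStore is a hypothetical name for illustration only.
from collections import defaultdict


class HistoryStore:
    """Keeps a separate message list for each thread_id."""

    def __init__(self):
        self._threads = defaultdict(list)

    def append(self, thread_id, role, content):
        # Record one message under the given conversation thread.
        self._threads[thread_id].append({"role": role, "content": content})

    def history(self, thread_id):
        # Each conversation sees only its own messages.
        return list(self._threads[thread_id])


store = HistoryStore()
store.append("alice-1", "user", "My name is Alice.")
store.append("bob-7", "user", "My name is Bob.")

# Alice's thread never leaks into Bob's, and vice versa.
print(len(store.history("alice-1")))              # 1
print(store.history("bob-7")[0]["content"])       # My name is Bob.
```

In a real deployment the same pattern applies, except the store is a persistent checkpointer and the history is injected into the agent prompt on each call.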
