Building with LangChain

LLM session management with Redis

About

When you’re building with LLMs, memory matters. Here, Ricardo Ferreira shows you how to give your AI app a brain: storing and reusing conversation history with LangChain and Redis. See how to connect chat memory to an OpenAI-powered LLM so your app can pick up right where it left off.

18 minutes
Key topics
  1. Build chat memory using LangChain and Redis
  2. Reuse past messages to make LLM responses smarter and more contextual
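To make the first topic concrete, here is a minimal sketch of Redis-backed chat memory in LangChain, using the community Redis integration and an OpenAI chat model. The Redis URL, session ID, and model name below are placeholder assumptions for illustration, not values from the session.

```python
# Minimal sketch: Redis-backed chat memory for an OpenAI chat model via LangChain.
# Assumes `langchain-openai`, `langchain-community`, and `redis` are installed,
# a Redis server at redis://localhost:6379, and OPENAI_API_KEY in the environment.
from langchain_community.chat_message_histories import RedisChatMessageHistory
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder(variable_name="history"),  # prior turns loaded from Redis
    ("human", "{input}"),
])

chain = prompt | ChatOpenAI(model="gpt-4o-mini")  # model name is an assumption


def get_session_history(session_id: str) -> RedisChatMessageHistory:
    # Messages for each session are stored under a Redis key derived from session_id.
    return RedisChatMessageHistory(session_id, url="redis://localhost:6379/0")


chat = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="input",
    history_messages_key="history",
)

config = {"configurable": {"session_id": "user-42"}}  # hypothetical session ID
chat.invoke({"input": "My name is Ana."}, config=config)
# A later call with the same session_id picks up the stored history,
# so the model can answer from earlier turns:
chat.invoke({"input": "What's my name?"}, config=config)
```

Because the history lives in Redis rather than in process memory, the conversation survives restarts, and any app instance that can reach the same Redis database can resume the same session.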
Speakers
Ricardo Ferreira

Principal Developer Advocate


Get started with Redis today

Speak to a Redis expert and learn more about enterprise-grade Redis today.