
06 · Persistent Memory 🟡

By default, agents are stateless: each .run() call starts fresh. Memory lets agents remember users, past conversations, and learned facts, even across server restarts.

What you'll learn

  • Short-term (session) memory: remember the current conversation
  • Long-term memory: remember facts across sessions
  • How to scope memory per user

The two types of memory

| Type      | Lives                | Use for                             |
| --------- | -------------------- | ----------------------------------- |
| Session   | Current conversation | Multi-turn chat, context tracking   |
| Long-term | Forever (persisted)  | User preferences, past interactions |
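The distinction can be made concrete with a small standalone sketch that uses no confused-ai APIs, only Node's fs module: session memory is a plain in-process Map that vanishes when the process exits, while a toy long-term store writes every fact to a JSON file so it survives a restart. The class name and file path here are illustrative, not part of the library.

```ts
import { readFileSync, writeFileSync, existsSync } from 'node:fs';

// Session memory: lives only as long as the process / conversation.
const session = new Map<string, string>();

// Long-term memory: a toy store that reloads from disk on construction
// and persists every write, which is conceptually what a file-backed
// storage adapter does for an agent's memory.
class ToyLongTermStore {
  private data: Record<string, string> = {};

  constructor(private path: string) {
    if (existsSync(path)) {
      this.data = JSON.parse(readFileSync(path, 'utf8'));
    }
  }

  set(key: string, value: string): void {
    this.data[key] = value;
    writeFileSync(this.path, JSON.stringify(this.data));
  }

  get(key: string): string | undefined {
    return this.data[key];
  }
}
```

A second `new ToyLongTermStore(path)` after a "restart" sees the same facts, whereas the session Map starts empty.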

Code

```ts
// memory-agent.ts
import { createAgent } from 'confused-ai';
import { InMemoryStore } from 'confused-ai/memory';
import { createStorage } from 'confused-ai/storage';

// ── Long-term memory store ─────────────────────────────────
// Use FileStorageAdapter to persist across restarts
const storage = createStorage({
  type: 'file',
  path: './data/memory.json',
});

const longTermMemory = new InMemoryStore({ storage });
await longTermMemory.load(); // restore from disk

// ── Create the agent ───────────────────────────────────────
const agent = createAgent({
  name: 'memory-agent',
  model: 'gpt-4o-mini',
  instructions: `
    You are a personal assistant that remembers things about users.
    When you learn something new about a user, store it.
    Always greet users by name if you know it.
  `,
  memory: longTermMemory,
  sessionStore: new InMemoryStore(),  // per-conversation context
});

// ── First conversation ─────────────────────────────────────
const userId = 'user_42';

await agent.run("My name is Alice and I prefer dark mode.", { userId });
// Agent stores: { name: 'Alice', preference: 'dark mode' }

await agent.run("I'm vegetarian and I live in Berlin.", { userId });
// Agent stores: { diet: 'vegetarian', location: 'Berlin' }

// ── Later conversation (new session, same userId) ──────────
const result = await agent.run("Recommend a restaurant near me.", { userId });
console.log(result.text);
// → "Since you're vegetarian and based in Berlin, here are some great options..."
```

Manual memory operations

```ts
// Store a fact directly
await longTermMemory.set(`${userId}:name`, 'Alice');
await longTermMemory.set(`${userId}:diet`, 'vegetarian');

// Read a fact
const name = await longTermMemory.get(`${userId}:name`);
console.log(name); // 'Alice'

// List all keys for a user
const keys = await longTermMemory.keys(`${userId}:*`);
console.log(keys); // ['user_42:name', 'user_42:diet', ...]

// Delete a fact
await longTermMemory.delete(`${userId}:diet`);

// Clear everything for a user (e.g., GDPR deletion)
for (const key of keys) {
  await longTermMemory.delete(key);
}
```
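How a wildcard query like keys('user_42:*') might behave can be sketched over a plain Map, without assuming anything about confused-ai's internals. This toy helper supports only the trailing-asterisk form used above; the function name is illustrative.

```ts
// Sketch of prefix-style key matching over a Map-backed store.
// Supports 'prefix:*' (match everything under the prefix) or an exact key.
function matchKeys(store: Map<string, unknown>, pattern: string): string[] {
  if (pattern.endsWith('*')) {
    const prefix = pattern.slice(0, -1); // 'user_42:*' -> 'user_42:'
    return [...store.keys()].filter((k) => k.startsWith(prefix));
  }
  return [...store.keys()].filter((k) => k === pattern);
}
```

Scoping every key under a `userId:` prefix is what makes per-user bulk operations (like the GDPR deletion loop above) a simple prefix scan.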

Summarize long conversations

For very long conversations, summarize older messages instead of keeping all of them:

```ts
const agent = createAgent({
  model: 'gpt-4o-mini',
  sessionStore: new InMemoryStore(),
  maxSessionTokens: 4000,     // when session exceeds this...
  sessionSummarize: true,     // ...auto-summarize older messages
  sessionSummaryModel: 'gpt-4o-mini',
});
```
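The trigger logic behind such an option can be sketched as a pure function: once a running token estimate exceeds the budget, the oldest messages are folded into a single summary message and only the most recent turns are kept verbatim. This is a standalone illustration, not the library's implementation; the token count here is a crude whitespace approximation, not a real tokenizer, and the keep-last-4 cutoff is arbitrary.

```ts
type Msg = { role: 'user' | 'assistant' | 'system'; content: string };

// Crude token estimate: count whitespace-separated words.
function estimateTokens(messages: Msg[]): number {
  return messages.reduce((n, m) => n + m.content.split(/\s+/).length, 0);
}

// If the session is over budget, replace everything but the last few
// turns with one system message produced by a summarizer callback.
function compact(
  messages: Msg[],
  maxTokens: number,
  summarize: (older: Msg[]) => string,
): Msg[] {
  if (estimateTokens(messages) <= maxTokens) return messages;
  const keep = messages.slice(-4);   // most recent turns stay verbatim
  const older = messages.slice(0, -4);
  return [
    { role: 'system', content: `Summary of earlier conversation: ${summarize(older)}` },
    ...keep,
  ];
}
```

In the real setting the `summarize` callback would be an LLM call (the role of sessionSummaryModel above); here it can be any function from old messages to a string.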

Scoped memory per user (multi-tenant)

```ts
// Each user gets their own isolated memory namespace
function createUserAgent(userId: string) {
  return createAgent({
    name: `agent-${userId}`,
    model: 'gpt-4o-mini',
    instructions: 'You are a personal assistant.',
    memory: longTermMemory,
    memoryNamespace: userId,  // ← all keys auto-prefixed with userId
  });
}

const agentForAlice = createUserAgent('alice');
const agentForBob   = createUserAgent('bob');
// Alice's memories never leak to Bob
```
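What a memoryNamespace option plausibly does can be shown with a small standalone wrapper: a shared store is wrapped so every key is transparently prefixed with the tenant id, and each wrapper can only see its own slice. The interface and helper name are illustrative, not confused-ai's actual types.

```ts
// Minimal key-value surface for the sketch.
interface KV {
  set(key: string, value: string): void;
  get(key: string): string | undefined;
}

// Wrap one shared Map so all reads and writes are prefixed with `ns:`.
// Two wrappers with different namespaces cannot see each other's keys.
function withNamespace(store: Map<string, string>, ns: string): KV {
  return {
    set: (key, value) => store.set(`${ns}:${key}`, value),
    get: (key) => store.get(`${ns}:${key}`),
  };
}
```

The isolation is purely by key prefix, so the underlying store stays shared (one file, one database table) while each tenant gets a logically private view.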

Vector memory

When you have thousands of memories, use vector search instead of key-value lookup:

```ts
import { InMemoryVectorStore, OpenAIEmbeddingProvider } from 'confused-ai/memory';

const agent = createAgent({
  model: 'gpt-4o-mini',
  vectorMemory: new InMemoryVectorStore(),
  embeddingProvider: new OpenAIEmbeddingProvider({
    apiKey: process.env.OPENAI_API_KEY!,
  }),
  vectorMemoryTopK: 5,  // recall top 5 relevant memories
});

// Agent automatically embeds + stores every conversation turn
// and retrieves semantically similar past context
```
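The retrieval step behind vectorMemoryTopK can be sketched without any embedding model: score stored vectors by cosine similarity against a query vector and return the k best matches. In practice the vectors would come from an embedding provider; here they are hand-supplied, and the types and function names are illustrative.

```ts
type Memory = { text: string; vector: number[] };

// Cosine similarity: dot product of the vectors divided by the
// product of their magnitudes. 1 = same direction, 0 = orthogonal.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank all stored memories by similarity to the query and keep the top k.
function topK(memories: Memory[], query: number[], k: number): Memory[] {
  return [...memories]
    .sort((x, y) => cosine(y.vector, query) - cosine(x.vector, query))
    .slice(0, k);
}
```

A linear scan like this is fine for thousands of memories; at much larger scale a vector store would swap it for an approximate nearest-neighbor index.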

What's next?

Released under the MIT License.