Memory
Long-term knowledge storage for recalling past conversations
Memory services let your agents remember information from past conversations, even across different sessions or days. This turns agents from stateless responders into systems that learn from history.
Key difference:
- Session State: Current conversation data (today's booking details, current step)
- Memory: Past conversation knowledge (preferences from last month, previous solutions)
ADK-TS provides two memory implementations: keyword search for development and semantic vector search for production.
Memory Service Interface
All memory services implement the BaseMemoryService interface:
interface BaseMemoryService {
addSessionToMemory(session: Session): Promise<void>;
searchMemory(params: {
appName: string;
userId: string;
query: string;
}): Promise<SearchMemoryResponse>;
}
- addSessionToMemory: Store a completed conversation in long-term memory
- searchMemory: Find relevant past information using natural language queries
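Because both built-in services implement the same contract, application code can be written against the interface and the backend swapped later. A minimal sketch, assuming BaseMemoryService and Session are exported as types from @iqai/adk and using only the two methods above:
import type { BaseMemoryService, Session } from "@iqai/adk";
// Store a finished conversation, then recall it later with a natural-language query.
// Works with any BaseMemoryService implementation (in-memory or Vertex AI RAG).
async function rememberAndRecall(memory: BaseMemoryService, session: Session) {
  await memory.addSessionToMemory(session);
  const { memories } = await memory.searchMemory({
    appName: "my-app",
    userId: "alice",
    query: "favorite programming language",
  });
  return memories;
}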
Memory Service Options
ADK-TS provides two memory implementations:
InMemoryMemoryService
Simple keyword-based memory for development and testing.
import { InMemoryMemoryService } from "@iqai/adk";
const memoryService = new InMemoryMemoryService();
How it works:
- Stores conversations in RAM (lost on restart)
- Searches using exact word matching
- Case-insensitive but doesn't understand synonyms
Example searches:
// ✅ Will match
"TypeScript projects" → matches "I prefer TypeScript for projects"
"prefer typescript" → matches "I prefer TypeScript"
// ❌ Won't match
"TS" → won't find "TypeScript"
"coding language" → won't find "programming language"When to use: Local development, testing, quick prototypes
VertexAiRagMemoryService
Production-grade semantic search using Google Cloud Vertex AI.
import { VertexAiRagMemoryService } from "@iqai/adk";
const memoryService = new VertexAiRagMemoryService(
"projects/my-project/locations/us-central1/ragCorpora/my-corpus-id",
10, // Return top 10 results
0.5 // Similarity threshold (0.0 strict - 1.0 loose)
);
How it works:
- Stores conversations in Google Cloud
- Uses vector embeddings for semantic search
- Understands synonyms and context
Example searches:
// ✅ Understands meaning
"luxury beachfront resorts" ≈ "high-end oceanfront accommodations"
"vacation home" ≈ "holiday rental property"
"budget friendly" ≈ "affordable", "economical"Prerequisites:
- Google Cloud project with billing enabled
- Vertex AI API enabled in your project
- RAG Corpus created in Vertex AI (stores your embeddings)
- Authentication configured (Application Default Credentials or service account)
When to use: Production apps requiring intelligent search
See Vertex AI Sessions for detailed Vertex AI setup instructions.
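Once the corpus exists and credentials are in place, the service is a drop-in replacement for InMemoryMemoryService: the same searchMemory call works unchanged, only the matching becomes semantic. A sketch (reading the corpus resource name from a RAG_CORPUS_NAME environment variable is just an assumed convention, not something the framework requires):
import { VertexAiRagMemoryService } from "@iqai/adk";
// Full corpus resource name, e.g. projects/<project>/locations/us-central1/ragCorpora/<corpus-id>
const corpusName = process.env.RAG_CORPUS_NAME ?? "";
const memoryService = new VertexAiRagMemoryService(corpusName, 10, 0.5);
// Same interface as the in-memory service, but results are ranked by semantic similarity.
const { memories } = await memoryService.searchMemory({
  appName: "my-app",
  userId: "alice",
  query: "high-end oceanfront accommodations",
});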
Using Memory
ADK-TS offers two approaches for integrating memory into your agents:
With AgentBuilder (Recommended)
The simplest way to add memory to your agent:
import {
AgentBuilder,
InMemorySessionService,
InMemoryMemoryService,
} from "@iqai/adk";
async function main() {
const sessionService = new InMemorySessionService();
const memoryService = new InMemoryMemoryService();
const { runner } = await AgentBuilder.create("assistant_agent")
.withModel("gpt-4o")
.withInstruction("You are a helpful programming assistant.")
.withSessionService(sessionService, {
userId: "alice",
appName: "my-app",
})
.withMemory(memoryService)
.build();
const response = await runner.ask("What's my favorite programming language?");
console.log(response);
}
main().catch(console.error);
With LlmAgent Directly
For more control, configure memory directly:
import { LlmAgent, InMemoryMemoryService } from "@iqai/adk";
const memoryService = new InMemoryMemoryService();
const agent = new LlmAgent({
name: "assistant",
description: "Helpful assistant with memory",
model: "gemini-2.0-flash",
memoryService,
});
Storing Conversations in Memory
Memory services don't automatically save conversations—you control when to store them:
// Initialize session and memory services
const sessionService = new InMemorySessionService();
// After a session completes
const session = await sessionService.getSession(
"my-app",
"alice",
"session-123"
);
// Store in memory
if (session) {
await memoryService.addSessionToMemory(session);
console.log("Session stored in memory");
}
Memory Search Tools
When you configure an agent with a memory service, the framework can automatically handle memory searches through the context. However, you can also use explicit memory search tools for more control or specialized search logic.
When to use memory tools:
- Using LlmAgent directly (not AgentBuilder)
- Need explicit control over memory search timing
- Building custom memory search with specialized logic
- Debugging or tracking when memory is accessed
When NOT needed:
- Using AgentBuilder with withMemory() (automatic memory access)
Built-in LoadMemoryTool
The framework provides LoadMemoryTool for explicit memory search:
import { LlmAgent, LoadMemoryTool } from "@iqai/adk";
const agent = new LlmAgent({
name: "memory_agent",
description: "Agent with memory capabilities",
model: "gemini-2.0-flash",
tools: [new LoadMemoryTool()],
});
The agent calls load_memory automatically when it needs past context.
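For load_memory to have anything to search, the agent also needs a memory service configured. A sketch combining the two, assuming the same memoryService option shown in the earlier LlmAgent example:
import { LlmAgent, LoadMemoryTool, InMemoryMemoryService } from "@iqai/adk";
const memoryService = new InMemoryMemoryService();
const agent = new LlmAgent({
  name: "memory_agent",
  description: "Agent with memory capabilities",
  model: "gemini-2.0-flash",
  memoryService, // backend the tool searches against
  tools: [new LoadMemoryTool()], // exposes load_memory to the model
});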
Custom Memory Tools
Create specialized memory search with custom logic:
import {
BaseTool,
type FunctionDeclaration,
type ToolContext,
} from "@iqai/adk";
class PastPurchasesTool extends BaseTool {
constructor() {
super({
name: "search_purchase_history",
description: "Searches the user's past purchases and orders.",
});
}
getDeclaration(): FunctionDeclaration {
return {
name: this.name,
description: this.description,
parameters: {
properties: {
category: {
description: "Product category to search",
},
},
required: ["category"],
},
};
}
async runAsync(args: { category: string }, context: ToolContext) {
const results = await context.searchMemory(`${args.category} purchases`);
return {
purchases: results.memories.map((m) => ({
item: m.content?.parts?.[0]?.text || "",
timestamp: m.timestamp,
})),
};
}
}
Direct Memory Search
Search memory directly in your code:
// Initialize memory service
const memoryService = new InMemoryMemoryService();
// Perform a search
const { memories } = await memoryService.searchMemory({
appName: "my-app",
userId: "alice",
query: "favorite restaurants",
});
console.log(`Found ${memories.length} relevant memories`);
What Gets Stored in Memory
Memory services automatically filter and store relevant conversation content:
Stored:
- ✅ User messages
- ✅ Agent responses
- ✅ Tool results with content
- ✅ Metadata (author, timestamp)
Not stored:
- ❌ Events without content
- ❌ Empty messages
- ❌ Binary data (use artifacts)
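Conceptually, the check applied to each event is a predicate like the one below. This is illustrative only, not the framework's actual implementation; the field names follow the event shape surfaced by search results earlier in this guide:
// Keep only events that carry non-empty text content.
// Binary payloads should be saved as artifacts instead of memory.
function isStorable(event: { content?: { parts?: Array<{ text?: string }> } }): boolean {
  const parts = event.content?.parts ?? [];
  return parts.some((part) => typeof part.text === "string" && part.text.trim().length > 0);
}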