Build a Personal AI Assistant - Part 3: Conversation Memory & Context Management

Welcome to Part 3! In the previous parts, we built a solid foundation for our AI assistant. Now we'll add sophisticated memory capabilities that allow your assistant to remember past conversations, manage context efficiently, and maintain meaningful long-term relationships with users.
Tutorial Navigation
- Previous: Part 2: Core Assistant Logic
- Current: Part 3: Conversation Memory & Context Management
- Next: Part 4: API Integration
- Series: Part 1: Setup | Part 5: Testing & Deployment
What You'll Build
By the end of this tutorial, you'll have:
- ✅ Persistent conversation storage (file-based and SQLite)
- ✅ Intelligent context window management
- ✅ Conversation summarization system
- ✅ Memory optimization strategies
- ✅ Conversation search and retrieval
- ✅ User session management
- ✅ Memory analytics and insights
- ✅ Advanced context trimming algorithms
Estimated Time: 35-40 minutes
Why Memory Matters
Without proper memory management, AI assistants:
- Lose context in long conversations
- Can't reference past discussions
- Consume excessive API tokens
- Provide inconsistent experiences
- Struggle with multi-session interactions
Our memory system will solve these problems by implementing:
Short-term Memory          Long-term Memory          Semantic Search
(recent messages)    →     (summaries)         →     (embeddings)
Step 1: Install Additional Dependencies
First, let's install the packages we'll need for our memory system:
# Database drivers (for the optional SQLite backend)
npm install sqlite3 better-sqlite3
# Utilities
npm install uuid crypto-js
# TypeScript type definitions for better-sqlite3 (optional)
npm install --save-dev @types/better-sqlite3
Step 2: Create Memory Storage System
Let's create a flexible storage system. We'll implement file-based (JSON) storage here and keep the interface simple enough to swap in a database backend later (a SQLite sketch follows the class below).
Create src/storage/MemoryStorage.js
import fs from 'fs-extra';
import path from 'path';
import { v4 as uuidv4 } from 'uuid';
import { createLogger } from '../utils/logger.js';
export class MemoryStorage {
constructor(options = {}) {
this.logger = createLogger();
this.storageType = options.type || 'file'; // 'file' or 'database'
this.dataDir = options.dataDir || './data';
this.conversationsFile = path.join(this.dataDir, 'conversations.json');
this.summariesFile = path.join(this.dataDir, 'summaries.json');
this.ensureDirectories();
}
async ensureDirectories() {
try {
await fs.ensureDir(this.dataDir);
// Initialize files if they don't exist
if (!await fs.pathExists(this.conversationsFile)) {
await fs.writeJson(this.conversationsFile, {});
}
if (!await fs.pathExists(this.summariesFile)) {
await fs.writeJson(this.summariesFile, {});
}
this.logger.info('Memory storage initialized');
} catch (error) {
this.logger.error('Failed to initialize storage:', error);
throw error;
}
}
/**
* Save a conversation to storage
*/
async saveConversation(conversationId, conversation) {
try {
const conversations = await this.loadConversations();
conversations[conversationId] = {
...conversation,
id: conversationId,
lastUpdated: new Date().toISOString(),
messageCount: conversation.messages ? conversation.messages.length : 0,
};
await fs.writeJson(this.conversationsFile, conversations, { spaces: 2 });
this.logger.debug(`Conversation ${conversationId} saved`);
return conversationId;
} catch (error) {
this.logger.error('Failed to save conversation:', error);
throw error;
}
}
/**
* Load a specific conversation
*/
async loadConversation(conversationId) {
try {
const conversations = await this.loadConversations();
const conversation = conversations[conversationId];
if (!conversation) {
this.logger.warn(`Conversation ${conversationId} not found`);
return null;
}
this.logger.debug(`Conversation ${conversationId} loaded`);
return conversation;
} catch (error) {
this.logger.error('Failed to load conversation:', error);
throw error;
}
}
/**
* Load all conversations
*/
async loadConversations() {
try {
return await fs.readJson(this.conversationsFile);
} catch (error) {
this.logger.error('Failed to load conversations:', error);
return {};
}
}
/**
* Delete a conversation
*/
async deleteConversation(conversationId) {
try {
const conversations = await this.loadConversations();
if (conversations[conversationId]) {
delete conversations[conversationId];
await fs.writeJson(this.conversationsFile, conversations, { spaces: 2 });
this.logger.info(`Conversation ${conversationId} deleted`);
return true;
}
return false;
} catch (error) {
this.logger.error('Failed to delete conversation:', error);
throw error;
}
}
/**
* Search conversations by content
*/
async searchConversations(query, limit = 10) {
try {
const conversations = await this.loadConversations();
const results = [];
for (const [id, conversation] of Object.entries(conversations)) {
if (this.conversationMatchesQuery(conversation, query)) {
results.push({
id,
...conversation,
relevanceScore: this.calculateRelevance(conversation, query),
});
}
}
// Sort by relevance and limit results
return results
.sort((a, b) => b.relevanceScore - a.relevanceScore)
.slice(0, limit);
} catch (error) {
this.logger.error('Failed to search conversations:', error);
return [];
}
}
/**
* Save conversation summary
*/
async saveSummary(conversationId, summary) {
try {
const summaries = await this.loadSummaries();
summaries[conversationId] = {
...summary,
conversationId,
createdAt: new Date().toISOString(),
};
await fs.writeJson(this.summariesFile, summaries, { spaces: 2 });
this.logger.debug(`Summary for ${conversationId} saved`);
} catch (error) {
this.logger.error('Failed to save summary:', error);
throw error;
}
}
/**
* Load conversation summary
*/
async loadSummary(conversationId) {
try {
const summaries = await this.loadSummaries();
return summaries[conversationId] || null;
} catch (error) {
this.logger.error('Failed to load summary:', error);
return null;
}
}
/**
* Load all summaries
*/
async loadSummaries() {
try {
return await fs.readJson(this.summariesFile);
} catch (error) {
this.logger.error('Failed to load summaries:', error);
return {};
}
}
/**
* Get storage statistics
*/
async getStats() {
try {
const conversations = await this.loadConversations();
const summaries = await this.loadSummaries();
const totalMessages = Object.values(conversations)
.reduce((sum, conv) => sum + (conv.messageCount || 0), 0);
const totalConversations = Object.keys(conversations).length;
const totalSummaries = Object.keys(summaries).length;
// Calculate storage size
const conversationsSize = await this.getFileSize(this.conversationsFile);
const summariesSize = await this.getFileSize(this.summariesFile);
return {
totalConversations,
totalMessages,
totalSummaries,
storageSize: {
conversations: conversationsSize,
summaries: summariesSize,
total: conversationsSize + summariesSize,
},
};
} catch (error) {
this.logger.error('Failed to get storage stats:', error);
return null;
}
}
// Helper methods
conversationMatchesQuery(conversation, query) {
const searchText = query.toLowerCase();
// Search in conversation metadata
if (conversation.title?.toLowerCase().includes(searchText)) {
return true;
}
// Search in messages
if (conversation.messages) {
return conversation.messages.some(message =>
message.content?.toLowerCase().includes(searchText)
);
}
return false;
}
calculateRelevance(conversation, query) {
let score = 0;
const searchText = query.toLowerCase();
// Title match gets higher score
if (conversation.title?.toLowerCase().includes(searchText)) {
score += 10;
}
// Message content matches
if (conversation.messages) {
conversation.messages.forEach(message => {
if (message.content?.toLowerCase().includes(searchText)) {
score += 1;
}
});
}
// Recent conversations get slight boost
if (conversation.lastUpdated) {
const daysSince = (Date.now() - new Date(conversation.lastUpdated)) / (1000 * 60 * 60 * 24);
score += Math.max(0, 5 - daysSince);
}
return score;
}
async getFileSize(filePath) {
try {
const stats = await fs.stat(filePath);
return stats.size;
} catch (error) {
return 0;
}
}
/**
* Clean up old conversations
*/
async cleanup(options = {}) {
const maxAge = options.maxAgeDays || 30;
const maxConversations = options.maxConversations || 1000;
try {
const conversations = await this.loadConversations();
const conversationIds = Object.keys(conversations);
if (conversationIds.length <= maxConversations) {
return { deleted: 0, kept: conversationIds.length };
}
// Sort by last updated date
const sortedConversations = conversationIds
.map(id => ({ id, ...conversations[id] }))
.sort((a, b) => new Date(b.lastUpdated) - new Date(a.lastUpdated));
// Keep only the most recent conversations
const toKeep = sortedConversations.slice(0, maxConversations);
const toDelete = sortedConversations.slice(maxConversations);
// Rebuild conversations object
const newConversations = {};
toKeep.forEach(conv => {
const { id, ...conversationData } = conv;
newConversations[id] = conversationData;
});
await fs.writeJson(this.conversationsFile, newConversations, { spaces: 2 });
this.logger.info(`Cleaned up ${toDelete.length} old conversations`);
return { deleted: toDelete.length, kept: toKeep.length };
} catch (error) {
this.logger.error('Failed to cleanup conversations:', error);
throw error;
}
}
}
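Optional: a SQLite-backed variant
We installed better-sqlite3 in Step 1 but only used JSON files above. If you expect thousands of conversations, the same interface can be backed by SQLite. The sketch below is a minimal, illustrative variant (the file name, table layout, and column names are our own assumptions) and only covers save/load; the remaining methods would follow the same pattern:
// src/storage/SqliteMemoryStorage.js (optional sketch)
import Database from 'better-sqlite3';
export class SqliteMemoryStorage {
  constructor(options = {}) {
    // Open (or create) the database file — dbPath is an assumed default
    this.db = new Database(options.dbPath || './data/conversations.db');
    this.db.exec(`
      CREATE TABLE IF NOT EXISTS conversations (
        id TEXT PRIMARY KEY,
        data TEXT NOT NULL,
        last_updated TEXT NOT NULL
      )
    `);
  }
  saveConversation(conversationId, conversation) {
    // Upsert the serialized conversation as a JSON blob
    this.db.prepare(`
      INSERT INTO conversations (id, data, last_updated)
      VALUES (?, ?, ?)
      ON CONFLICT(id) DO UPDATE SET data = excluded.data, last_updated = excluded.last_updated
    `).run(conversationId, JSON.stringify(conversation), new Date().toISOString());
    return conversationId;
  }
  loadConversation(conversationId) {
    const row = this.db.prepare('SELECT data FROM conversations WHERE id = ?').get(conversationId);
    return row ? JSON.parse(row.data) : null;
  }
}
Note that better-sqlite3 is synchronous, so if you swap this in behind the async MemoryStorage interface, keep the method signatures async (callers already await them).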
Step 3: Create Context Management System
Now let's create a sophisticated context manager that handles conversation windows intelligently.
Create src/memory/ContextManager.js
import { createLogger } from '../utils/logger.js';
import { config } from '../config/assistant.js';
export class ContextManager {
constructor(options = {}) {
this.logger = createLogger();
this.maxContextTokens = options.maxContextTokens || 3000;
this.maxMessages = options.maxMessages || 50;
this.summaryThreshold = options.summaryThreshold || 20;
this.preserveSystemPrompt = options.preserveSystemPrompt !== false;
}
/**
* Optimize conversation context for API calls
*/
async optimizeContext(messages, options = {}) {
try {
// Always preserve system prompt
const systemPrompt = messages.find(msg => msg.role === 'system');
const conversationMessages = messages.filter(msg => msg.role !== 'system');
// If under limits, return as-is
if (conversationMessages.length <= this.maxMessages) {
const estimatedTokens = this.estimateTokens(messages);
if (estimatedTokens <= this.maxContextTokens) {
return messages;
}
}
this.logger.debug('Optimizing conversation context...');
// Apply context optimization strategies
let optimizedMessages = await this.applyContextStrategies(
conversationMessages,
options
);
// Add system prompt back
if (systemPrompt && this.preserveSystemPrompt) {
optimizedMessages = [systemPrompt, ...optimizedMessages];
}
const finalTokens = this.estimateTokens(optimizedMessages);
this.logger.debug(`Context optimized: ${messages.length} → ${optimizedMessages.length} messages, ~${finalTokens} tokens`);
return optimizedMessages;
} catch (error) {
this.logger.error('Context optimization failed:', error);
return messages; // Return original on error
}
}
/**
* Apply various context optimization strategies
*/
async applyContextStrategies(messages, options = {}) {
let optimizedMessages = [...messages];
// Strategy 1: Remove very old messages
if (optimizedMessages.length > this.maxMessages) {
const keepRecent = Math.floor(this.maxMessages * 0.8);
optimizedMessages = optimizedMessages.slice(-keepRecent);
this.logger.debug(`Applied recent message strategy: kept last ${keepRecent} messages`);
}
// Strategy 2: Token-based trimming
let estimatedTokens = this.estimateTokens(optimizedMessages);
if (estimatedTokens > this.maxContextTokens) {
optimizedMessages = await this.trimByTokens(optimizedMessages);
estimatedTokens = this.estimateTokens(optimizedMessages);
}
// Strategy 3: Preserve important messages
if (options.preserveImportant) {
optimizedMessages = this.preserveImportantMessages(optimizedMessages);
}
// Strategy 4: Compress repetitive content
if (options.compressRepetitive) {
optimizedMessages = this.compressRepetitiveContent(optimizedMessages);
}
return optimizedMessages;
}
/**
* Trim messages based on token count
*/
async trimByTokens(messages) {
const targetTokens = Math.floor(this.maxContextTokens * 0.9);
let currentTokens = 0;
const trimmedMessages = [];
// Start from the most recent messages
for (let i = messages.length - 1; i >= 0; i--) {
const message = messages[i];
const messageTokens = this.estimateTokens([message]);
if (currentTokens + messageTokens <= targetTokens) {
trimmedMessages.unshift(message);
currentTokens += messageTokens;
} else {
break;
}
}
this.logger.debug(`Token-based trimming: ${messages.length} → ${trimmedMessages.length} messages`);
return trimmedMessages;
}
/**
* Preserve important messages (questions, errors, key information)
*/
preserveImportantMessages(messages) {
const important = [];
const regular = [];
messages.forEach(message => {
if (this.isImportantMessage(message)) {
important.push(message);
} else {
regular.push(message);
}
});
// Keep all important messages + recent regular messages
const maxRegular = Math.max(0, this.maxMessages - important.length);
const recentRegular = regular.slice(-maxRegular);
// Merge and sort by timestamp
const preserved = [...important, ...recentRegular]
.sort((a, b) => new Date(a.timestamp) - new Date(b.timestamp));
this.logger.debug(`Preserved ${important.length} important messages`);
return preserved;
}
/**
* Check if a message is considered important
*/
isImportantMessage(message) {
const content = message.content?.toLowerCase() || '';
// Questions are important
if (content.includes('?')) return true;
// Errors and warnings
if (content.includes('error') || content.includes('warning')) return true;
// User preferences or settings
if (content.includes('remember') || content.includes('prefer')) return true;
// Assistant providing key information
if (message.role === 'assistant' && content.length > 200) return true;
return false;
}
/**
* Compress repetitive content
*/
compressRepetitiveContent(messages) {
const compressed = [];
let consecutiveCount = 1;
for (let i = 0; i < messages.length; i++) {
const current = messages[i];
const next = messages[i + 1];
// Check if next message is similar
if (next && this.areSimilarMessages(current, next)) {
consecutiveCount++;
continue;
}
// Add the message (possibly with compression note)
if (consecutiveCount > 1) {
compressed.push({
...current,
content: `${current.content} [... ${consecutiveCount - 1} similar messages compressed]`,
metadata: {
...current.metadata,
compressed: true,
originalCount: consecutiveCount,
},
});
} else {
compressed.push(current);
}
consecutiveCount = 1;
}
if (compressed.length < messages.length) {
this.logger.debug(`Compressed ${messages.length - compressed.length} repetitive messages`);
}
return compressed;
}
/**
* Check if two messages are similar enough to compress
*/
areSimilarMessages(msg1, msg2) {
if (msg1.role !== msg2.role) return false;
const content1 = msg1.content?.toLowerCase() || '';
const content2 = msg2.content?.toLowerCase() || '';
// Simple similarity check (can be enhanced with more sophisticated algorithms)
if (content1.length < 20 || content2.length < 20) return false;
const commonWords = this.getCommonWords(content1, content2);
const similarity = commonWords.length / Math.min(content1.split(' ').length, content2.split(' ').length);
return similarity > 0.7;
}
/**
* Get common words between two texts
*/
getCommonWords(text1, text2) {
const words1 = new Set(text1.split(' ').filter(word => word.length > 3));
const words2 = new Set(text2.split(' ').filter(word => word.length > 3));
return [...words1].filter(word => words2.has(word));
}
/**
* Estimate token count for messages
*/
estimateTokens(messages) {
// Rough estimation: ~4 characters per token for English text
const totalChars = messages.reduce((sum, msg) => {
return sum + (msg.content?.length || 0) + (msg.role?.length || 0);
}, 0);
return Math.ceil(totalChars / 4);
}
/**
* Get context statistics
*/
getContextStats(messages) {
const estimatedTokens = this.estimateTokens(messages);
const roles = messages.reduce((counts, msg) => {
counts[msg.role] = (counts[msg.role] || 0) + 1;
return counts;
}, {});
return {
messageCount: messages.length,
estimatedTokens,
roleDistribution: roles,
averageMessageLength: messages.length > 0
? Math.round(estimatedTokens / messages.length * 4)
: 0,
withinLimits: {
messages: messages.length <= this.maxMessages,
tokens: estimatedTokens <= this.maxContextTokens,
},
};
}
/**
* Create conversation summary for long-term memory
*/
async createSummary(messages, options = {}) {
try {
const conversationMessages = messages.filter(msg => msg.role !== 'system');
if (conversationMessages.length < this.summaryThreshold) {
return null;
}
// Extract key information
const summary = {
messageCount: conversationMessages.length,
timespan: this.calculateTimespan(conversationMessages),
topics: this.extractTopics(conversationMessages),
keyPoints: this.extractKeyPoints(conversationMessages),
userPreferences: this.extractUserPreferences(conversationMessages),
sentiment: this.analyzeSentiment(conversationMessages),
createdAt: new Date().toISOString(),
};
this.logger.debug('Conversation summary created');
return summary;
} catch (error) {
this.logger.error('Failed to create summary:', error);
return null;
}
}
/**
* Extract topics from conversation
*/
extractTopics(messages) {
const topicWords = new Map();
const stopWords = new Set(['the', 'a', 'an', 'and', 'or', 'but', 'in', 'on', 'at', 'to', 'for', 'of', 'with', 'by', 'is', 'are', 'was', 'were', 'be', 'been', 'have', 'has', 'had', 'do', 'does', 'did', 'will', 'would', 'could', 'should', 'may', 'might', 'can']);
messages.forEach(message => {
const words = (message.content || '')
.toLowerCase()
.replace(/[^\w\s]/g, '')
.split(' ')
.filter(word => word.length > 3 && !stopWords.has(word));
words.forEach(word => {
topicWords.set(word, (topicWords.get(word) || 0) + 1);
});
});
// Return top topics
return Array.from(topicWords.entries())
.sort(([, a], [, b]) => b - a)
.slice(0, 10)
.map(([word, count]) => ({ topic: word, frequency: count }));
}
/**
* Extract key points from conversation
*/
extractKeyPoints(messages) {
const keyPoints = [];
messages.forEach(message => {
const content = message.content || '';
// Questions from user
if (message.role === 'user' && content.includes('?')) {
keyPoints.push({
type: 'question',
content: content.slice(0, 100) + '...',
timestamp: message.timestamp,
});
}
// Important information from assistant
if (message.role === 'assistant' && content.length > 100) {
keyPoints.push({
type: 'information',
content: content.slice(0, 100) + '...',
timestamp: message.timestamp,
});
}
});
return keyPoints.slice(-10); // Keep last 10 key points
}
/**
* Extract user preferences from conversation
*/
extractUserPreferences(messages) {
const preferences = [];
const preferenceKeywords = ['prefer', 'like', 'dislike', 'always', 'never', 'remember'];
messages
.filter(msg => msg.role === 'user')
.forEach(message => {
const content = (message.content || '').toLowerCase();
preferenceKeywords.forEach(keyword => {
if (content.includes(keyword)) {
preferences.push({
keyword,
context: message.content.slice(0, 150) + '...',
timestamp: message.timestamp,
});
}
});
});
return preferences;
}
/**
* Analyze conversation sentiment
*/
analyzeSentiment(messages) {
const positiveWords = ['good', 'great', 'excellent', 'amazing', 'wonderful', 'fantastic', 'perfect', 'love', 'like', 'happy', 'pleased'];
const negativeWords = ['bad', 'terrible', 'awful', 'horrible', 'hate', 'dislike', 'frustrated', 'angry', 'disappointed', 'confused'];
let positive = 0;
let negative = 0;
messages.forEach(message => {
const content = (message.content || '').toLowerCase();
positiveWords.forEach(word => {
if (content.includes(word)) positive++;
});
negativeWords.forEach(word => {
if (content.includes(word)) negative++;
});
});
const total = positive + negative;
if (total === 0) return 'neutral';
const sentiment = positive / total;
if (sentiment > 0.6) return 'positive';
if (sentiment < 0.4) return 'negative';
return 'neutral';
}
/**
* Calculate timespan of conversation
*/
calculateTimespan(messages) {
if (messages.length < 2) return null;
const timestamps = messages
.map(msg => msg.timestamp)
.filter(Boolean)
.map(ts => new Date(ts))
.sort((a, b) => a - b);
if (timestamps.length < 2) return null;
const start = timestamps[0];
const end = timestamps[timestamps.length - 1];
const duration = end - start;
return {
start: start.toISOString(),
end: end.toISOString(),
durationMs: duration,
durationHuman: this.formatDuration(duration),
};
}
/**
* Format duration in human-readable format
*/
formatDuration(ms) {
const seconds = Math.floor(ms / 1000);
const minutes = Math.floor(seconds / 60);
const hours = Math.floor(minutes / 60);
const days = Math.floor(hours / 24);
if (days > 0) return `${days} day${days > 1 ? 's' : ''}`;
if (hours > 0) return `${hours} hour${hours > 1 ? 's' : ''}`;
if (minutes > 0) return `${minutes} minute${minutes > 1 ? 's' : ''}`;
return `${seconds} second${seconds > 1 ? 's' : ''}`;
}
}
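The estimateTokens() heuristic above (roughly 4 characters per token) is deliberately rough, and it's what the class uses everywhere. If you want exact counts before sending a request, you could swap in a real tokenizer. A minimal sketch using the gpt-tokenizer npm package — an assumption on our part, so install it separately and check its docs for the model you target:
// Optional: exact token counting (assumes `npm install gpt-tokenizer`)
import { encode } from 'gpt-tokenizer';

function countTokensExactly(messages) {
  // Sum the encoded length of each message's role and content.
  // This ignores the small per-message overhead of the chat format.
  return messages.reduce((sum, msg) => sum + encode(`${msg.role}: ${msg.content || ''}`).length, 0);
}
You could then call this from estimateTokens() in place of the character heuristic.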
Step 4: Integrate Memory with Assistant
Now let's enhance our Assistant class to use the new memory system.
Update src/services/Assistant.js
Add these imports at the top (uuid is needed for generating conversation IDs):
import { v4 as uuidv4 } from 'uuid';
import { MemoryStorage } from '../storage/MemoryStorage.js';
import { ContextManager } from '../memory/ContextManager.js';
Then add these methods to the Assistant class:
// Add to the constructor
constructor(options = {}) {
// ... existing code ...
// Initialize memory systems
this.memoryStorage = new MemoryStorage({
type: 'file',
dataDir: './data/conversations',
});
this.contextManager = new ContextManager({
maxContextTokens: 3000,
maxMessages: 50,
summaryThreshold: 20,
});
// Current conversation state
this.currentConversationId = null;
this.autoSave = options.autoSave !== false;
this.contextOptimization = options.contextOptimization !== false;
// ... rest of existing code ...
}
/**
* Start a new conversation
*/
async startNewConversation(title = null) {
try {
// Save current conversation if exists
if (this.currentConversationId && this.autoSave) {
await this.saveCurrentConversation();
}
// Generate new conversation ID
this.currentConversationId = uuidv4();
// Reset conversation history
this.conversationHistory = [];
this.isInitialized = false;
// Initialize with system prompt
await this.initialize();
// Set conversation title
if (title) {
this.conversationTitle = title;
}
this.logger.info(`Started new conversation: ${this.currentConversationId}`);
return this.currentConversationId;
} catch (error) {
this.logger.error('Failed to start new conversation:', error);
throw error;
}
}
/**
* Load an existing conversation
*/
async loadConversation(conversationId) {
try {
const conversation = await this.memoryStorage.loadConversation(conversationId);
if (!conversation) {
throw new Error(`Conversation ${conversationId} not found`);
}
// Save current conversation first
if (this.currentConversationId && this.autoSave) {
await this.saveCurrentConversation();
}
// Load the conversation
this.currentConversationId = conversationId;
this.conversationHistory = conversation.messages || [];
this.conversationTitle = conversation.title || null;
this.isInitialized = true;
this.logger.info(`Loaded conversation: ${conversationId}`);
return conversation;
} catch (error) {
this.logger.error('Failed to load conversation:', error);
throw error;
}
}
/**
* Save current conversation to storage
*/
async saveCurrentConversation() {
if (!this.currentConversationId) {
return;
}
try {
const conversation = {
id: this.currentConversationId,
title: this.conversationTitle || this.generateConversationTitle(),
messages: this.conversationHistory,
createdAt: this.conversationHistory[0]?.timestamp || new Date().toISOString(),
stats: this.getStats(),
};
await this.memoryStorage.saveConversation(this.currentConversationId, conversation);
// Create summary if conversation is long enough
const summary = await this.contextManager.createSummary(this.conversationHistory);
if (summary) {
await this.memoryStorage.saveSummary(this.currentConversationId, summary);
}
this.logger.debug(`Conversation saved: ${this.currentConversationId}`);
} catch (error) {
this.logger.error('Failed to save conversation:', error);
}
}
/**
* Enhanced getResponse with memory integration
*/
async getResponse(userInput, options = {}) {
try {
// Ensure assistant is initialized
await this.initialize();
// Generate conversation ID if not exists
if (!this.currentConversationId) {
this.currentConversationId = uuidv4();
}
// Validate input
if (!userInput || typeof userInput !== 'string') {
throw new Error('User input must be a non-empty string');
}
if (userInput.trim().length === 0) {
throw new Error('User input cannot be empty');
}
// Check rate limits
await this.rateLimiter.checkLimit();
// Start timing
const startTime = Date.now();
// Add user message to history
const userMessage = {
role: 'user',
content: userInput.trim(),
timestamp: new Date().toISOString(),
};
this.conversationHistory.push(userMessage);
this.logger.info(`User: ${userInput}`);
// Optimize context if enabled
let contextMessages = this.conversationHistory;
if (this.contextOptimization) {
contextMessages = await this.contextManager.optimizeContext(
this.conversationHistory,
options
);
}
// Generate response with retry logic
const response = await this.retryHandler.execute(
() => this.callOpenAI({ ...options, messages: contextMessages })
);
// Add assistant response to history
const assistantMessage = {
role: 'assistant',
content: response.content,
timestamp: new Date().toISOString(),
metadata: {
model: response.model,
tokensUsed: response.usage?.total_tokens || 0,
responseTime: Date.now() - startTime,
contextOptimized: contextMessages.length !== this.conversationHistory.length,
},
};
this.conversationHistory.push(assistantMessage);
// Update statistics
this.updateStats(assistantMessage.metadata);
// Auto-save if enabled
if (this.autoSave && this.conversationHistory.length % 10 === 0) {
await this.saveCurrentConversation();
}
this.logger.info(`${config.personality.name}: ${response.content}`);
return {
content: response.content,
metadata: assistantMessage.metadata,
conversationId: this.currentConversationId,
};
} catch (error) {
this.stats.errors++;
this.logger.error('Error generating response:', error);
return {
content: "I'm sorry, I encountered an error while processing your request. Please try again.",
error: true,
errorType: error.name,
metadata: {
responseTime: 0,
tokensUsed: 0,
},
};
}
}
/**
* Search conversation history
*/
async searchConversations(query, limit = 10) {
try {
const results = await this.memoryStorage.searchConversations(query, limit);
this.logger.info(`Found ${results.length} conversations matching "${query}"`);
return results;
} catch (error) {
this.logger.error('Search failed:', error);
return [];
}
}
/**
* Get memory statistics
*/
async getMemoryStats() {
try {
const storageStats = await this.memoryStorage.getStats();
const contextStats = this.contextManager.getContextStats(this.conversationHistory);
return {
storage: storageStats,
context: contextStats,
currentConversation: {
id: this.currentConversationId,
messageCount: this.conversationHistory.length,
title: this.conversationTitle,
},
};
} catch (error) {
this.logger.error('Failed to get memory stats:', error);
return null;
}
}
/**
* Generate conversation title based on content
*/
generateConversationTitle() {
const userMessages = this.conversationHistory
.filter(msg => msg.role === 'user')
.slice(0, 3);
if (userMessages.length === 0) {
return `Conversation ${new Date().toLocaleDateString()}`;
}
// Use first user message as base for title
const firstMessage = userMessages[0].content;
const words = firstMessage.split(' ').slice(0, 6);
let title = words.join(' ');
if (firstMessage.length > title.length) {
title += '...';
}
return title || `Conversation ${new Date().toLocaleDateString()}`;
}
/**
* Clean up old conversations
*/
async cleanupMemory(options = {}) {
try {
const result = await this.memoryStorage.cleanup(options);
this.logger.info(`Memory cleanup completed: ${result.deleted} deleted, ${result.kept} kept`);
return result;
} catch (error) {
this.logger.error('Memory cleanup failed:', error);
throw error;
}
}
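Before wiring up the CLI, it helps to sanity-check the new methods from a throwaway script. The snippet below is a rough sketch: it assumes the Assistant class export and constructor options from Part 2, so adjust the import path and setup to match your project:
// scripts/memory-smoke-test.js (hypothetical helper script)
import { Assistant } from '../src/services/Assistant.js';

const assistant = new Assistant({ autoSave: true });

// Start a fresh conversation and exchange a couple of messages
const conversationId = await assistant.startNewConversation('Memory smoke test');
await assistant.getResponse('Please remember that I prefer short answers.');
await assistant.getResponse('What did I just ask you to remember?');

// Persist it, then confirm it can be found again
await assistant.saveCurrentConversation();
const matches = await assistant.searchConversations('short answers');
console.log(`Found ${matches.length} matching conversation(s); current id: ${conversationId}`);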
Step 5: Update CLI for Memory Management
Let's enhance our CLI to support the new memory features.
Update src/cli/interactive.js
Add these new command handlers to the handleCommand method:
async handleCommand(command) {
const [cmd, ...args] = command.slice(1).split(' ');
switch (cmd.toLowerCase()) {
// ... existing commands ...
case 'save':
await this.saveConversation(args.join(' '));
break;
case 'load':
await this.loadConversation(args[0]);
break;
case 'list':
await this.listConversations();
break;
case 'search':
await this.searchConversations(args.join(' '));
break;
case 'new':
await this.startNewConversation(args.join(' '));
break;
case 'memory':
await this.displayMemoryStats();
break;
case 'cleanup':
await this.cleanupMemory();
break;
// ... existing default case ...
}
console.log();
this.rl.prompt();
}
async saveConversation(title) {
try {
if (title) {
this.assistant.conversationTitle = title;
}
await this.assistant.saveCurrentConversation();
console.log(chalk.green('Conversation saved successfully'));
} catch (error) {
console.log(chalk.red(`Save failed: ${error.message}`));
}
}
async loadConversation(conversationId) {
if (!conversationId) {
console.log(chalk.red('Please provide a conversation ID'));
return;
}
try {
await this.assistant.loadConversation(conversationId);
console.log(chalk.green(`Loaded conversation: ${conversationId}`));
} catch (error) {
console.log(chalk.red(`Load failed: ${error.message}`));
}
}
async listConversations() {
try {
const conversations = await this.assistant.memoryStorage.loadConversations();
const conversationList = Object.entries(conversations)
.sort(([,a], [,b]) => new Date(b.lastUpdated) - new Date(a.lastUpdated))
.slice(0, 10);
if (conversationList.length === 0) {
console.log(chalk.gray('No saved conversations found.'));
return;
}
console.log(chalk.cyan('Recent Conversations:'));
conversationList.forEach(([id, conv], index) => {
const date = new Date(conv.lastUpdated).toLocaleDateString();
const messageCount = conv.messageCount || 0;
const title = conv.title || 'Untitled';
console.log(chalk.white(`${index + 1}. ${title}`));
console.log(chalk.gray(` ID: ${id.slice(0, 8)}... | ${messageCount} messages | ${date}`));
});
} catch (error) {
console.log(chalk.red(`Failed to list conversations: ${error.message}`));
}
}
async searchConversations(query) {
if (!query) {
console.log(chalk.red('Please provide a search query'));
return;
}
try {
const results = await this.assistant.searchConversations(query);
if (results.length === 0) {
console.log(chalk.gray(`No conversations found matching "${query}"`));
return;
}
console.log(chalk.cyan(`Found ${results.length} conversation(s) matching "${query}":`));
results.forEach((result, index) => {
const date = new Date(result.lastUpdated).toLocaleDateString();
console.log(chalk.white(`${index + 1}. ${result.title || 'Untitled'}`));
console.log(chalk.gray(` ID: ${result.id.slice(0, 8)}... | Score: ${result.relevanceScore} | ${date}`));
});
} catch (error) {
console.log(chalk.red(`Search failed: ${error.message}`));
}
}
async startNewConversation(title) {
try {
const conversationId = await this.assistant.startNewConversation(title);
console.log(chalk.green(`Started new conversation: ${title || conversationId.slice(0, 8)}`));
} catch (error) {
console.log(chalk.red(`Failed to start new conversation: ${error.message}`));
}
}
async displayMemoryStats() {
try {
const stats = await this.assistant.getMemoryStats();
console.log(chalk.cyan('Memory Statistics:'));
console.log('  Storage:');
console.log(`    Total conversations: ${stats.storage.totalConversations}`);
console.log(`    Total messages: ${stats.storage.totalMessages}`);
console.log(`    Total summaries: ${stats.storage.totalSummaries}`);
console.log(`    Storage size: ${(stats.storage.storageSize.total / 1024).toFixed(2)} KB`);
console.log('  Current Context:');
console.log(`    Messages: ${stats.context.messageCount}`);
console.log(`    Estimated tokens: ${stats.context.estimatedTokens}`);
console.log(`    Within limits: ${stats.context.withinLimits.messages && stats.context.withinLimits.tokens ? '✅' : '⚠️'}`);
if (stats.currentConversation.id) {
console.log('  Current Conversation:');
console.log(`    ID: ${stats.currentConversation.id.slice(0, 8)}...`);
console.log(`    Messages: ${stats.currentConversation.messageCount}`);
console.log(`    Title: ${stats.currentConversation.title || 'Untitled'}`);
}
} catch (error) {
console.log(chalk.red(`Failed to get memory stats: ${error.message}`));
}
}
async cleanupMemory() {
try {
console.log(chalk.yellow('Cleaning up old conversations...'));
const result = await this.assistant.cleanupMemory({ maxConversations: 100 });
console.log(chalk.green(`Cleanup completed: ${result.deleted} deleted, ${result.kept} kept`));
} catch (error) {
console.log(chalk.red(`Cleanup failed: ${error.message}`));
}
}
// Update the displayHelp method to include new commands
displayHelp() {
console.log(chalk.cyan('Available Commands:'));
console.log(chalk.yellow('/help ') + '- Show this help message');
console.log(chalk.yellow('/stats ') + '- Display conversation statistics');
console.log(chalk.yellow('/history ') + '- Show conversation history');
console.log(chalk.yellow('/clear ') + '- Clear conversation history');
console.log(chalk.yellow('/export ') + '- Export conversation to file');
console.log(chalk.yellow('/settings ') + '- Show current settings');
console.log('');
console.log(chalk.cyan('Memory Commands:'));
console.log(chalk.yellow('/save [title] ') + '- Save current conversation');
console.log(chalk.yellow('/load <id> ') + '- Load saved conversation');
console.log(chalk.yellow('/list ') + '- List recent conversations');
console.log(chalk.yellow('/search <query>') + '- Search conversations');
console.log(chalk.yellow('/new [title] ') + '- Start new conversation');
console.log(chalk.yellow('/memory ') + '- Show memory statistics');
console.log(chalk.yellow('/cleanup ') + '- Clean up old conversations');
console.log('');
console.log(chalk.yellow('/quit ') + '- Exit the assistant');
}
Step 6: Add Memory Tests
Create comprehensive tests for our memory system.
Create tests/MemoryStorage.test.js
import { jest, describe, test, expect, beforeEach, afterEach } from '@jest/globals';
import { MemoryStorage } from '../src/storage/MemoryStorage.js';
import fs from 'fs-extra';
import path from 'path';
describe('MemoryStorage', () => {
let storage;
let testDir;
beforeEach(async () => {
testDir = path.join('./test-data', `test-${Date.now()}`);
storage = new MemoryStorage({ dataDir: testDir });
await storage.ensureDirectories();
});
afterEach(async () => {
if (await fs.pathExists(testDir)) {
await fs.remove(testDir);
}
});
describe('Conversation Management', () => {
test('should save and load conversations', async () => {
const conversationId = 'test-conversation-1';
const conversation = {
title: 'Test Conversation',
messages: [
{ role: 'user', content: 'Hello', timestamp: new Date().toISOString() },
{ role: 'assistant', content: 'Hi there!', timestamp: new Date().toISOString() }
]
};
await storage.saveConversation(conversationId, conversation);
const loaded = await storage.loadConversation(conversationId);
expect(loaded).toBeTruthy();
expect(loaded.title).toBe(conversation.title);
expect(loaded.messages).toHaveLength(2);
expect(loaded.id).toBe(conversationId);
});
test('should return null for non-existent conversation', async () => {
const result = await storage.loadConversation('non-existent');
expect(result).toBeNull();
});
test('should delete conversations', async () => {
const conversationId = 'test-conversation-2';
const conversation = { title: 'Test', messages: [] };
await storage.saveConversation(conversationId, conversation);
const deleted = await storage.deleteConversation(conversationId);
const loaded = await storage.loadConversation(conversationId);
expect(deleted).toBe(true);
expect(loaded).toBeNull();
});
});
describe('Search Functionality', () => {
test('should search conversations by content', async () => {
// Save test conversations
await storage.saveConversation('conv1', {
title: 'JavaScript Discussion',
messages: [{ role: 'user', content: 'Tell me about JavaScript' }]
});
await storage.saveConversation('conv2', {
title: 'Python Tutorial',
messages: [{ role: 'user', content: 'How do I learn Python?' }]
});
const results = await storage.searchConversations('JavaScript');
expect(results).toHaveLength(1);
expect(results[0].title).toBe('JavaScript Discussion');
});
});
describe('Statistics', () => {
test('should return storage statistics', async () => {
await storage.saveConversation('conv1', {
messages: [
{ role: 'user', content: 'Hello' },
{ role: 'assistant', content: 'Hi' }
]
});
const stats = await storage.getStats();
expect(stats.totalConversations).toBe(1);
expect(stats.totalMessages).toBe(2);
expect(stats.storageSize.total).toBeGreaterThan(0);
});
});
});
Create tests/ContextManager.test.js
import { jest, describe, test, expect, beforeEach } from '@jest/globals';
import { ContextManager } from '../src/memory/ContextManager.js';
describe('ContextManager', () => {
let contextManager;
beforeEach(() => {
contextManager = new ContextManager({
maxContextTokens: 1000,
maxMessages: 10,
summaryThreshold: 5
});
});
describe('Context Optimization', () => {
test('should preserve system prompt', async () => {
const messages = [
{ role: 'system', content: 'You are a helpful assistant' },
{ role: 'user', content: 'Hello' },
{ role: 'assistant', content: 'Hi there!' }
];
const optimized = await contextManager.optimizeContext(messages);
expect(optimized[0].role).toBe('system');
expect(optimized).toHaveLength(3);
});
test('should trim messages when over limit', async () => {
const messages = [
{ role: 'system', content: 'System prompt' }
];
// Add many messages to exceed limit
for (let i = 0; i < 20; i++) {
messages.push(
{ role: 'user', content: `Message ${i}`, timestamp: new Date().toISOString() },
{ role: 'assistant', content: `Response ${i}`, timestamp: new Date().toISOString() }
);
}
const optimized = await contextManager.optimizeContext(messages);
expect(optimized.length).toBeLessThan(messages.length);
expect(optimized[0].role).toBe('system'); // System prompt preserved
});
});
describe('Token Estimation', () => {
test('should estimate tokens correctly', () => {
const messages = [
{ role: 'user', content: 'Hello world!' },
{ role: 'assistant', content: 'Hi there, how are you doing today?' }
];
const tokens = contextManager.estimateTokens(messages);
expect(tokens).toBeGreaterThan(0);
expect(typeof tokens).toBe('number');
});
});
describe('Summary Creation', () => {
test('should create summary for long conversations', async () => {
const messages = [];
// Create a conversation long enough to trigger summary
for (let i = 0; i < 10; i++) {
messages.push(
{
role: 'user',
content: `Question ${i} about programming?`,
timestamp: new Date(Date.now() + i * 1000).toISOString()
},
{
role: 'assistant',
content: `Answer ${i} about programming concepts`,
timestamp: new Date(Date.now() + i * 1000 + 500).toISOString()
}
);
}
const summary = await contextManager.createSummary(messages);
expect(summary).toBeTruthy();
expect(summary.messageCount).toBe(20);
expect(summary.topics).toBeTruthy();
expect(summary.keyPoints).toBeTruthy();
expect(summary.sentiment).toBeTruthy();
});
test('should not create summary for short conversations', async () => {
const messages = [
{ role: 'user', content: 'Hello' },
{ role: 'assistant', content: 'Hi' }
];
const summary = await contextManager.createSummary(messages);
expect(summary).toBeNull();
});
});
});
Step 7: Update Environment Configuration
Add memory-related configuration to your .env file:
# Memory Configuration
MEMORY_STORAGE_TYPE=file
MEMORY_DATA_DIR=./data/conversations
AUTO_SAVE_CONVERSATIONS=true
CONTEXT_OPTIMIZATION=true
# Context Management
MAX_CONTEXT_TOKENS=3000
MAX_CONTEXT_MESSAGES=50
SUMMARY_THRESHOLD=20
# Memory Cleanup
MAX_CONVERSATIONS=1000
CLEANUP_INTERVAL_HOURS=24
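These variables aren't picked up automatically — wherever you construct the memory classes (the Assistant constructor from Step 4 is the natural place), read them from process.env. A minimal sketch, assuming dotenv is already loaded as in Part 1:
// In the Assistant constructor: fall back to the defaults used earlier
this.memoryStorage = new MemoryStorage({
  type: process.env.MEMORY_STORAGE_TYPE || 'file',
  dataDir: process.env.MEMORY_DATA_DIR || './data/conversations',
});
this.contextManager = new ContextManager({
  maxContextTokens: parseInt(process.env.MAX_CONTEXT_TOKENS || '3000', 10),
  maxMessages: parseInt(process.env.MAX_CONTEXT_MESSAGES || '50', 10),
  summaryThreshold: parseInt(process.env.SUMMARY_THRESHOLD || '20', 10),
});
this.autoSave = process.env.AUTO_SAVE_CONVERSATIONS !== 'false';
this.contextOptimization = process.env.CONTEXT_OPTIMIZATION !== 'false';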
Step 8: Testing Your Enhanced Assistant
Run Your Tests
npm test
Test Memory Features
1. Start your assistant:
npm run dev
2. Test conversation saving:
You: Hello, my name is John
Alex: Hello John! Nice to meet you...
You: /save "Meeting John"
Conversation saved successfully
3. Test conversation listing:
You: /list
Recent Conversations:
1. Meeting John
   ID: a1b2c3d4... | 2 messages | 8/14/2025
4. Test search functionality:
You: /search "John"
Found 1 conversation(s) matching "John":
1. Meeting John
   ID: a1b2c3d4... | Score: 11 | 8/14/2025
5. Test memory statistics:
You: /memory
Memory Statistics:
  Storage:
    Total conversations: 1
    Total messages: 4
    Storage size: 1.23 KB
Performance Monitoring
Add Performance Tracking
Monitor your memory system's performance. This sketch assumes you also track contextOptimizations, memorySaves, and totalRequests counters on this.stats (increment them in updateStats() if you haven't already):
// Add to your assistant
async getPerformanceMetrics() {
const memoryStats = await this.getMemoryStats();
return {
memory: memoryStats,
performance: {
averageResponseTime: this.stats.averageResponseTime,
// These rates assume the counters mentioned above exist on this.stats
contextOptimizationRate: this.stats.contextOptimizations / this.stats.totalRequests,
memorySavingRate: this.stats.memorySaves / this.stats.totalRequests,
}
};
}
Best Practices for Memory Management
1. Storage Optimization
- Regular Cleanup: Run cleanup operations periodically (see the scheduling sketch after this list)
- Compression: Compress old conversations to save space
- Archiving: Move very old conversations to cold storage
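One straightforward way to run cleanup periodically is a timer in your app's entry point, driven by the CLEANUP_INTERVAL_HOURS and MAX_CONVERSATIONS settings from Step 7. A rough sketch, assuming a long-running process (a cron job that triggers /cleanup would work just as well):
// Call once at startup, e.g. from your main entry point (sketch)
function scheduleMemoryCleanup(assistant) {
  const hours = parseInt(process.env.CLEANUP_INTERVAL_HOURS || '24', 10);
  const maxConversations = parseInt(process.env.MAX_CONVERSATIONS || '1000', 10);
  const timer = setInterval(() => {
    assistant.cleanupMemory({ maxConversations })
      .catch(err => console.error('Scheduled cleanup failed:', err));
  }, hours * 60 * 60 * 1000);
  timer.unref(); // don't let the timer alone keep the process alive
  return timer;
}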
2. Context Management
- Smart Trimming: Preserve important messages over chronological order
- Summarization: Create summaries instead of keeping full histories
- Token Budgeting: Reserve tokens for important context
3. Search Performance
- Indexing: Consider adding search indexing for large datasets
- Caching: Cache frequently accessed conversations (see the cache sketch after this list)
- Pagination: Implement pagination for large result sets
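Since loadConversation() re-reads the whole JSON file on every call, a small read-through cache is an easy win for frequently accessed conversations. A minimal sketch (the class name, eviction policy, and file location are our own choices):
// Read-through cache around MemoryStorage (sketch)
import { MemoryStorage } from './MemoryStorage.js'; // adjust path to where the class lives

export class CachedMemoryStorage extends MemoryStorage {
  constructor(options = {}) {
    super(options);
    this.cache = new Map();
    this.maxCacheEntries = options.maxCacheEntries || 100;
  }

  async loadConversation(conversationId) {
    if (this.cache.has(conversationId)) {
      return this.cache.get(conversationId);
    }
    const conversation = await super.loadConversation(conversationId);
    if (conversation) {
      // Evict the oldest entry once the cache is full
      if (this.cache.size >= this.maxCacheEntries) {
        this.cache.delete(this.cache.keys().next().value);
      }
      this.cache.set(conversationId, conversation);
    }
    return conversation;
  }

  async saveConversation(conversationId, conversation) {
    this.cache.delete(conversationId); // invalidate so the next read is fresh
    return super.saveConversation(conversationId, conversation);
  }
}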
Troubleshooting Common Issues
Memory Issues
Problem: Application running out of memory
Solution:
- Implement conversation cleanup
- Reduce context window size
- Use summarization more aggressively
Problem: Slow conversation loading
Solution:
- Optimize JSON file structure
- Consider switching to database storage
- Implement conversation caching
Context Issues
Problem: Important context being lost
Solution:
- Adjust importance detection criteria
- Increase context window size
- Improve summarization quality
What We've Built
Congratulations! You've now added sophisticated memory capabilities to your AI assistant:
Memory Features Implemented
- ✅ Persistent Storage - File-based conversation storage
- ✅ Context Management - Intelligent context window optimization
- ✅ Conversation Search - Full-text search across conversations
- ✅ Summarization - Automatic conversation summaries
- ✅ Memory Analytics - Detailed memory usage statistics
- ✅ CLI Integration - Memory commands in the interactive interface
- ✅ Performance Monitoring - Memory performance tracking
- ✅ Cleanup System - Automatic old conversation cleanup
Key Benefits
- Long-term Memory: Your assistant remembers past conversations
- Efficient Context: Smart context management saves API costs
- Search Capability: Find relevant past conversations quickly
- Scalable Storage: Handle thousands of conversations efficiently
- Performance Optimized: Intelligent trimming and compression
Next Steps
In Part 4: API Integration, we'll add:
- External API integrations (weather, calendar, email)
- Function calling capabilities
- Plugin system architecture
- API rate limiting and caching
- Real-world automation examples
Quick Preview of Part 4
// Coming in Part 4
class APIIntegration {
async callWeatherAPI(location) {
// Weather integration
}
async scheduleCalendarEvent(event) {
// Calendar integration
}
async sendEmail(recipient, subject, body) {
// Email integration
}
}
Additional Resources
- Node.js File System Best Practices
- SQLite for Node.js
- Memory Management in Node.js
- OpenAI Token Counting
Ready to continue? Proceed to Part 4: API Integration →
Your assistant now has a sophisticated memory system! It can remember conversations, optimize context automatically, and provide intelligent search capabilities. Test out the new memory features before moving to the next part.