Overview

elizaOS uses Drizzle ORM with PostgreSQL/PGLite and features a powerful dynamic migration system that automatically manages database schema changes at runtime. This guide demonstrates how to add custom tables to your plugins, create repositories for data access, and build actions and providers to interact with your data.

Key Features

  • Automatic Migrations: Schema changes are detected and applied automatically
  • PGLite & PostgreSQL Support: Works with both databases seamlessly
  • Schema Isolation: Each plugin gets its own namespace to avoid conflicts
  • Safety First: Destructive changes are blocked in production by default
  • Zero Configuration: No manual migration files needed
Important: the Drizzle ORM version in your plugin must match the monorepo version. Check packages/core/package.json for the exact required version.

Dynamic Migration System

Since elizaOS 1.0, plugins can define schemas that are automatically migrated without any manual intervention. The system:
  1. Detects Changes: Compares your schema with the database state
  2. Generates SQL: Creates migration statements automatically
  3. Applies Safely: Runs migrations in transactions with rollback capability
  4. Tracks History: Maintains complete audit trail of all schema changes
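
The detect-and-diff step above can be sketched in a few lines (a simplified illustration only, not the actual migration engine; `diffColumns` and the column shapes are hypothetical):

```typescript
// Simplified sketch of schema diffing: compare the columns a plugin
// declares against the columns currently in the database and emit the
// ALTER statements needed to reconcile them.
type ColumnDef = { type: string; notNull: boolean };
type TableColumns = Record<string, ColumnDef>;

export function diffColumns(
  table: string,
  desired: TableColumns,
  actual: TableColumns
): string[] {
  const statements: string[] = [];
  for (const [name, def] of Object.entries(desired)) {
    if (!(name in actual)) {
      // Additive change: always safe to apply
      statements.push(
        `ALTER TABLE ${table} ADD COLUMN ${name} ${def.type}` +
          (def.notNull ? ' NOT NULL' : '')
      );
    }
  }
  for (const name of Object.keys(actual)) {
    if (!(name in desired)) {
      // Destructive change: the real system blocks this in production
      statements.push(`ALTER TABLE ${table} DROP COLUMN ${name}`);
    }
  }
  return statements;
}
```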

Database Compatibility

| Database   | Development    | Production     | Features                               |
|------------|----------------|----------------|----------------------------------------|
| PGLite     | ✅ Recommended | ⚠️ Limited     | Fast, in-memory, no setup needed       |
| PostgreSQL | ✅ Supported   | ✅ Recommended | Full features, vector search, scaling  |

Version Requirements

Critical: Your plugin’s Drizzle version must match the monorepo version:
// In your plugin's package.json
{
  "dependencies": {
    "drizzle-orm": "^0.44.2" // Must match packages/core version
  }
}
# Check the monorepo's Drizzle version
cat packages/core/package.json | grep drizzle-orm

# Update your plugin to match
bun add drizzle-orm@^0.44.2

Step 1: Define Your Custom Schema

Schema Namespacing

Important: Plugins should use namespaced schemas to avoid conflicts:
  • Core Plugin (@elizaos/plugin-sql): Uses the public schema
  • All Other Plugins: Must use plugin_<name> schema namespace
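
As an illustration, the plugin_<name> convention can be encoded in a small helper (hypothetical, not part of the elizaOS API):

```typescript
// Derives a namespaced Postgres schema name from a plugin package name,
// following the plugin_<name> convention described above.
export function schemaNameFor(pluginName: string): string {
  // Strip a scope like "@your-org/" and a "plugin-" prefix, then sanitize
  const bare = pluginName
    .replace(/^@[^/]+\//, '')
    .replace(/^plugin-/, '')
    .replace(/[^a-z0-9_]/gi, '_')
    .toLowerCase();
  return `plugin_${bare}`;
}
```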

Creating a Shared Table

To create a table accessible by all agents (no agentId field):
// src/schema.ts
import { pgSchema, pgTable, uuid, text, timestamp, jsonb, index } from 'drizzle-orm/pg-core';

// Create a namespaced schema for your plugin
const pluginSchema = pgSchema('plugin_myplugin');

// Define tables within your schema namespace
export const userPreferencesTable = pluginSchema.table(
  'user_preferences',
  {
    id: uuid('id').primaryKey().defaultRandom(),
    userId: uuid('user_id').notNull(), // Links to the user
    preferences: jsonb('preferences').default({}).notNull(),
    createdAt: timestamp('created_at').defaultNow().notNull(),
    updatedAt: timestamp('updated_at').defaultNow().notNull(),
  },
  (table) => [index('idx_user_preferences_user_id').on(table.userId)]
);

// Export your schema for the plugin
export const schema = {
  userPreferencesTable,
};
Key Points:
  • Use pgSchema('plugin_yourname') for namespace isolation
  • Tables without agentId are shared across all agents
  • Migrations are generated and applied automatically at runtime
  • Indexes are created automatically with the table

Creating Agent-Specific Tables

For data that should be scoped to individual agents:
// Still use the same schema namespace
import { pgSchema, uuid, varchar, jsonb, timestamp, index } from 'drizzle-orm/pg-core';

const pluginSchema = pgSchema('plugin_myplugin');

export const agentDataTable = pluginSchema.table(
  'agent_data',
  {
    id: uuid('id').primaryKey().defaultRandom(),
    agentId: uuid('agent_id').notNull(), // Scopes to specific agent
    key: varchar('key', { length: 255 }).notNull(),
    value: jsonb('value').notNull(),
    createdAt: timestamp('created_at').defaultNow().notNull(),
  },
  (table) => [index('idx_agent_data_agent_key').on(table.agentId, table.key)]
);

PGLite Compatibility Notes

// Some features may need adjustments for PGLite
import { pgSchema, uuid, text, vector } from 'drizzle-orm/pg-core';

const pluginSchema = pgSchema('plugin_myplugin');

// For PGLite compatibility, avoid:
// - Complex PostgreSQL-specific functions
// - Some extension types (check PGLite docs)
// - Large vector dimensions (keep under 2000)

// Safe for both PGLite and PostgreSQL:
export const compatibleTable = pluginSchema.table('compatible', {
  id: uuid('id').primaryKey().defaultRandom(),
  content: text('content'),
  // Vector works but check dimension limits
  embedding: vector('embedding', { dimensions: 1536 }),
});

Step 2: Create a Repository for Database Access

Repository Pattern

Create a repository class to handle database operations. This follows the pattern used throughout elizaOS:
// In your plugin's repositories/user-preferences-repository.ts

import { eq } from 'drizzle-orm';
import { drizzle } from 'drizzle-orm/node-postgres';
import { UUID } from '@elizaos/core';
import { userPreferencesTable } from '../schema.ts';

export interface UserPreferences {
  id: UUID;
  userId: UUID;
  preferences: Record<string, any>;
  createdAt: Date;
  updatedAt: Date;
}

export class UserPreferencesRepository {
  constructor(private readonly db: ReturnType<typeof drizzle>) {}

  /**
   * Create or update user preferences
   */
  async upsert(userId: UUID, preferences: Record<string, any>): Promise<UserPreferences> {
    // Check if preferences exist
    const existing = await this.findByUserId(userId);

    if (existing) {
      // Update existing
      const [updated] = await this.db
        .update(userPreferencesTable)
        .set({
          preferences,
          updatedAt: new Date(),
        })
        .where(eq(userPreferencesTable.userId, userId))
        .returning();

      return this.mapToUserPreferences(updated);
    } else {
      // Create new
      const [created] = await this.db
        .insert(userPreferencesTable)
        .values({
          userId,
          preferences,
          createdAt: new Date(),
          updatedAt: new Date(),
        })
        .returning();

      return this.mapToUserPreferences(created);
    }
  }

  /**
   * Find preferences by user ID
   */
  async findByUserId(userId: UUID): Promise<UserPreferences | null> {
    const result = await this.db
      .select()
      .from(userPreferencesTable)
      .where(eq(userPreferencesTable.userId, userId))
      .limit(1);

    return result.length > 0 ? this.mapToUserPreferences(result[0]) : null;
  }

  /**
   * Delete preferences by user ID
   */
  async deleteByUserId(userId: UUID): Promise<boolean> {
    const result = await this.db
      .delete(userPreferencesTable)
      .where(eq(userPreferencesTable.userId, userId))
      .returning();

    return result.length > 0;
  }

  /**
   * Find all preferences (with pagination)
   */
  async findAll(offset = 0, limit = 100): Promise<UserPreferences[]> {
    const results = await this.db.select().from(userPreferencesTable).offset(offset).limit(limit);

    return results.map(this.mapToUserPreferences);
  }

  /**
   * Map database row to domain type
   */
  private mapToUserPreferences(row: any): UserPreferences {
    return {
      id: row.id as UUID,
      userId: row.userId || row.user_id,
      preferences: row.preferences || {},
      createdAt: row.createdAt || row.created_at,
      updatedAt: row.updatedAt || row.updated_at,
    };
  }
}

Advanced Repository Patterns

Transactions

import { eq, sql } from 'drizzle-orm';
import { pgSchema, uuid, integer, timestamp } from 'drizzle-orm/pg-core';
import { drizzle } from 'drizzle-orm/node-postgres';
import type { UUID } from '@elizaos/core';

// Define tables with namespace
const pluginSchema = pgSchema('plugin_myplugin');
const userPointsTable = pluginSchema.table('user_points', {
  userId: uuid('user_id').primaryKey(),
  points: integer('points').default(0),
  updatedAt: timestamp('updated_at'),
});
const transactionLogTable = pluginSchema.table('transaction_log', {
  id: uuid('id').primaryKey().defaultRandom(),
  fromUserId: uuid('from_user_id'),
  toUserId: uuid('to_user_id'),
  amount: integer('amount'),
  createdAt: timestamp('created_at'),
});

export class TransactionalRepository {
  constructor(private readonly db: ReturnType<typeof drizzle>) {}

  async transferPoints(fromUserId: UUID, toUserId: UUID, points: number): Promise<void> {
    await this.db.transaction(async (tx) => {
      // Deduct from sender
      await tx
        .update(userPointsTable)
        .set({
          points: sql`${userPointsTable.points} - ${points}`,
          updatedAt: new Date(),
        })
        .where(eq(userPointsTable.userId, fromUserId));

      // Add to receiver
      await tx
        .update(userPointsTable)
        .set({
          points: sql`${userPointsTable.points} + ${points}`,
          updatedAt: new Date(),
        })
        .where(eq(userPointsTable.userId, toUserId));

      // Log transaction
      await tx.insert(transactionLogTable).values({
        fromUserId,
        toUserId,
        amount: points,
        createdAt: new Date(),
      });
    });
  }
}

Complex Queries

import { and, eq, gte, sql, count, countDistinct } from 'drizzle-orm';
import { pgSchema, uuid, varchar, timestamp } from 'drizzle-orm/pg-core';
import { drizzle } from 'drizzle-orm/node-postgres';
import type { UUID } from '@elizaos/core';

// Define tables with namespace
const pluginSchema = pgSchema('plugin_myplugin');
const userActionsTable = pluginSchema.table('user_actions', {
  id: uuid('id').primaryKey().defaultRandom(),
  userId: uuid('user_id').notNull(),
  actionType: varchar('action_type', { length: 100 }),
  createdAt: timestamp('created_at').defaultNow(),
});

interface ActivityStats {
  totalActions: number;
  uniqueDays: number;
  mostCommonAction: unknown;
}

export class AnalyticsRepository {
  constructor(private readonly db: ReturnType<typeof drizzle>) {}

  async getUserActivityStats(userId: UUID, days = 30): Promise<ActivityStats> {
    const startDate = new Date();
    startDate.setDate(startDate.getDate() - days);

    const stats = await this.db
      .select({
        totalActions: count(userActionsTable.id),
        uniqueDays: countDistinct(sql`DATE(${userActionsTable.createdAt})`),
        mostCommonAction: sql`
          MODE() WITHIN GROUP (ORDER BY ${userActionsTable.actionType})
        `,
      })
      .from(userActionsTable)
      .where(and(eq(userActionsTable.userId, userId), gte(userActionsTable.createdAt, startDate)))
      .groupBy(userActionsTable.userId);

    return stats[0] || { totalActions: 0, uniqueDays: 0, mostCommonAction: null };
  }
}

Step 3: Create an Action to Write Data

Action Structure

Actions process user input and store data using the repository:
import type { Action, IAgentRuntime, Memory, ActionResult } from '@elizaos/core';
import { parseKeyValueXml } from '@elizaos/core';
import { UserPreferencesRepository } from '../repositories/user-preferences-repository.ts';

export const storeUserPreferencesAction: Action = {
  name: 'STORE_USER_PREFERENCES',
  description: 'Extract and store user preferences from messages',

  validate: async (runtime: IAgentRuntime, message: Memory) => {
    const text = message.content.text?.toLowerCase() || '';
    return text.includes('preference') || text.includes('prefer') || text.includes('like');
  },

  handler: async (runtime: IAgentRuntime, message: Memory) => {
    // 1. Create prompt for LLM to extract structured data
    const extractionPrompt = `
      Extract user preferences from the following message.
      Return in XML format:
      
      <preferences>
        <theme>light/dark/auto</theme>
        <language>en/es/fr/etc</language>
        <notifications>true/false</notifications>
        <customPreference>value</customPreference>
      </preferences>
      
      Message: "${message.content.text}"
    `;

    // 2. Use runtime's LLM
    const llmResponse = await runtime.completion({
      messages: [{ role: 'system', content: extractionPrompt }],
    });

    // 3. Parse the response
    const extractedPreferences = parseKeyValueXml(llmResponse.content);

    // 4. Get database and repository
    const db = runtime.databaseAdapter.db;
    const repository = new UserPreferencesRepository(db);

    // 5. Store preferences
    const userId = message.userId || message.entityId;
    const stored = await repository.upsert(userId, extractedPreferences);

    return {
      success: true,
      data: stored,
      text: 'Your preferences have been saved successfully.',
    };
  },
};

Batch Operations Action

export const batchImportAction: Action = {
  name: 'BATCH_IMPORT',
  description: 'Import multiple records at once',

  handler: async (runtime, message) => {
    const db = runtime.databaseAdapter.db;
    // DataRepository and dataTable are placeholders for your plugin's own repository and table
    const repository = new DataRepository(db);

    // Parse batch data from message
    const records = JSON.parse(message.content.text);

    // Use batch insert for performance
    const results = await db
      .insert(dataTable)
      .values(
        records.map((r) => ({
          ...r,
          createdAt: new Date(),
          updatedAt: new Date(),
        }))
      )
      .returning();

    return {
      success: true,
      text: `Imported ${results.length} records successfully`,
      data: { importedCount: results.length },
    };
  },
};

Step 4: Create a Provider to Read Data

Provider Structure

Providers make data available to agents during conversations:
import type { Provider, IAgentRuntime, Memory } from '@elizaos/core';
import { UserPreferencesRepository } from '../repositories/user-preferences-repository.ts';

export const userPreferencesProvider: Provider = {
  name: 'USER_PREFERENCES',
  description: 'Provides user preferences to customize agent behavior',
  dynamic: true, // Fetches fresh data on each request

  get: async (runtime: IAgentRuntime, message: Memory) => {
    // 1. Get user ID from message
    const userId = message.userId || message.entityId;

    // 2. Get database and repository
    const db = runtime.databaseAdapter.db;
    const repository = new UserPreferencesRepository(db);

    // 3. Fetch preferences
    const userPrefs = await repository.findByUserId(userId);

    if (!userPrefs) {
      return {
        data: { preferences: {} },
        values: { preferences: 'No preferences found' },
        text: '',
      };
    }

    // 4. Format data for agent context
    const preferencesText = `
# User Preferences
${Object.entries(userPrefs.preferences)
  .map(([key, value]) => `- ${key}: ${value}`)
  .join('\n')}
    `.trim();

    return {
      data: { preferences: userPrefs.preferences },
      values: userPrefs.preferences,
      text: preferencesText, // This text is added to agent context
    };
  },
};

Caching Provider

export const cachedDataProvider: Provider = {
  name: 'CACHED_DATA',
  private: true,

  get: async (runtime, message) => {
    const cacheKey = `data_${message.roomId}`;
    const cached = runtime.cacheManager.get(cacheKey);

    if (cached && Date.now() - cached.timestamp < 60000) {
      // 1 minute cache
      return cached.data;
    }

    // Fetch fresh data
    const db = runtime.databaseAdapter.db;
    const repository = new DataRepository(db);
    const freshData = await repository.getRoomData(message.roomId);

    const result = {
      text: formatData(freshData),
      data: freshData,
      values: { roomData: freshData },
    };

    // Cache the result
    runtime.cacheManager.set(cacheKey, {
      data: result,
      timestamp: Date.now(),
    });

    return result;
  },
};

Step 5: Register Your Components

Plugin Configuration

Register your schema with your plugin - migrations run automatically:
import type { Plugin } from '@elizaos/core';
import { schema } from './schema';

export const myPlugin: Plugin = {
  name: '@your-org/plugin-myplugin',
  description: 'My custom plugin with database tables',

  // Register schema - migrations run automatically on plugin load
  schema: schema,

  actions: [storeUserPreferencesAction],
  providers: [userPreferencesProvider],

  async init(runtime) {
    // Migrations have already run by the time init is called
    // Your tables are ready to use
    console.log('Plugin initialized with migrated schema');
  },
};

export default myPlugin;

How Migrations Run

When your plugin loads:
  1. Schema Discovery: System finds your schema definition
  2. Diff Generation: Compares with current database state
  3. Safety Check: Blocks destructive changes in production
  4. Migration: Applies changes in a transaction
  5. Recording: Stores migration history in migrations schema
# Development - migrations run automatically with verbose output
bun run dev

# Production - destructive migrations blocked by default
NODE_ENV=production bun run start

# Allow destructive migrations when needed
ELIZA_ALLOW_DESTRUCTIVE_MIGRATIONS=true bun run start

Important Considerations

1. Drizzle Version Matching

Critical: Your plugin must use the same Drizzle ORM version as the monorepo:
# Check monorepo version (from plugin directory)
grep "drizzle-orm" ../../packages/core/package.json

# Install matching version in your plugin
bun add drizzle-orm@^0.44.2  # Use the exact version from core
Mismatched versions can cause:
  • Migration generation failures
  • Type incompatibilities
  • Runtime errors
  • Schema sync issues

2. Schema Namespacing & Data Patterns

// ✅ CORRECT: Namespaced schema for plugins
const mySchema = pgSchema('plugin_myplugin');
export const myTable = mySchema.table('my_table', {...});

// ❌ WRONG: Using public schema in plugins
export const myTable = pgTable('my_table', {...}); // Goes to public
Without agentId in your tables:
  • All agents can read and write the same data
  • Use userId or other identifiers to scope data appropriately
  • Consider data consistency across multiple agents
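
To illustrate why shared tables need a userId filter, here is a minimal in-memory sketch (the row shape and function are hypothetical, not elizaOS APIs):

```typescript
// In a shared table every agent sees every row, so queries must filter
// by userId (or another identifier) to stay scoped correctly.
type PreferenceRow = { userId: string; preferences: Record<string, unknown> };

export function preferencesFor(rows: PreferenceRow[], userId: string) {
  // Without this filter, agent A could read or overwrite agent B's user data
  return rows.filter((r) => r.userId === userId).map((r) => r.preferences);
}
```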

3. Database Compatibility

| Feature        | PGLite | PostgreSQL | Notes                        |
|----------------|--------|------------|------------------------------|
| Basic Tables   | ✅     | ✅         | Full support                 |
| Indexes        | ✅     | ✅         | Full support                 |
| JSON/JSONB     | ✅     | ✅         | Full support                 |
| Vectors        | ⚠️     | ✅         | PGLite has dimension limits  |
| Extensions     | ⚠️     | ✅         | PGLite doesn't support all   |
| Advisory Locks | ❌     | ✅         | PGLite skips locking         |

4. Error Handling

try {
  const result = await repository.upsert(userId, preferences);
  return { success: true, data: result };
} catch (error) {
  console.error('Failed to store preferences:', error);
  return {
    success: false,
    error: error instanceof Error ? error.message : 'Unknown error',
  };
}

5. Migration Behavior

// Development environment (default)
// - Verbose logging
// - More permissive with warnings
// - Advisory locks skipped for PGLite

// Production environment
// - Minimal logging
// - Destructive changes blocked
// - Full concurrency protection

// Override destructive change protection
process.env.ELIZA_ALLOW_DESTRUCTIVE_MIGRATIONS = 'true';
Safe Changes (always allowed):
  • Adding new tables
  • Adding nullable columns
  • Adding indexes
  • Extending varchar length
Destructive Changes (require permission):
  • Dropping tables or columns
  • Changing column types
  • Adding NOT NULL to existing columns
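
The safe-versus-destructive distinction above can be expressed as a small classifier (an illustrative sketch; the operation names are hypothetical, not the migration system's internal types):

```typescript
// Classifies a schema change as safe (auto-applied) or destructive
// (blocked in production unless ELIZA_ALLOW_DESTRUCTIVE_MIGRATIONS=true).
type SchemaChange =
  | { kind: 'addTable' }
  | { kind: 'addColumn'; nullable: boolean }
  | { kind: 'addIndex' }
  | { kind: 'extendVarchar' }
  | { kind: 'dropTable' }
  | { kind: 'dropColumn' }
  | { kind: 'changeColumnType' };

export function isDestructive(change: SchemaChange): boolean {
  switch (change.kind) {
    case 'addTable':
    case 'addIndex':
    case 'extendVarchar':
      return false;
    case 'addColumn':
      // Adding a NOT NULL column to an existing table is destructive
      return !change.nullable;
    default:
      // dropTable, dropColumn, changeColumnType all require permission
      return true;
  }
}
```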

Complete Example Workflow

1. Initial Setup

# Create your plugin
elizaos create plugin
# Choose: TypeScript, Database support

# Install matching Drizzle version
cd packages/plugin-myplugin
bun add drizzle-orm@^0.44.2  # Match monorepo version

2. Define Your Schema

// src/schema.ts
import { pgSchema, pgTable, uuid, text } from 'drizzle-orm/pg-core';

const pluginSchema = pgSchema('plugin_myplugin');

export const preferences = pluginSchema.table('preferences', {
  id: uuid('id').primaryKey().defaultRandom(),
  userId: uuid('user_id').notNull(),
  theme: text('theme').default('light'),
});

export default { preferences };

3. Register with Plugin

// src/index.ts
export const plugin: Plugin = {
  name: '@your-org/plugin-myplugin',
  schema: schema, // Migrations run automatically!
  // ... actions, providers, etc
};

4. Runtime Flow

  1. Plugin Loads: Schema detected, migrations run automatically
  2. User Message: “I prefer dark theme and Spanish language”
  3. Action Executes: Stores preferences in database
  4. Provider Reads: Supplies preferences to agent context
  5. Multiple Agents: All agents access the same schema namespace
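
The flow above can be sketched end to end with an in-memory stand-in for the database (all names here are hypothetical; a real plugin would use the repository and runtime APIs shown earlier):

```typescript
// Minimal round trip: an "action" writes a preference, a "provider"
// reads it back and formats it for the agent's context.
const store = new Map<string, Record<string, string>>();

export function storePreferences(userId: string, prefs: Record<string, string>): void {
  // Upsert: merge new preferences over any existing ones
  store.set(userId, { ...(store.get(userId) ?? {}), ...prefs });
}

export function providePreferences(userId: string): string {
  const prefs = store.get(userId);
  if (!prefs) return '';
  // Same formatting style the provider above injects into agent context
  return (
    '# User Preferences\n' +
    Object.entries(prefs)
      .map(([key, value]) => `- ${key}: ${value}`)
      .join('\n')
  );
}
```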

Advanced Patterns

Vector Search

import { sql } from 'drizzle-orm';
import { pgSchema, uuid, text, jsonb, vector } from 'drizzle-orm/pg-core';
import { drizzle } from 'drizzle-orm/node-postgres';

// Use namespaced schema
const pluginSchema = pgSchema('plugin_myplugin');

export const documentTable = pluginSchema.table('documents', {
  id: uuid('id').primaryKey().defaultRandom(),
  content: text('content').notNull(),
  embedding: vector('embedding', { dimensions: 1536 }),
  metadata: jsonb('metadata').default({}),
});

export type Document = typeof documentTable.$inferSelect;

export class DocumentRepository {
  constructor(private readonly db: ReturnType<typeof drizzle>) {}

  async searchSimilar(embedding: number[], limit = 10): Promise<Document[]> {
    return await this.db
      .select()
      .from(documentTable)
      .orderBy(sql`${documentTable.embedding} <-> ${JSON.stringify(embedding)}`)
      .limit(limit);
  }
}

Time-Series Data

import { and, eq, gte, sql, avg } from 'drizzle-orm';
import { pgSchema, uuid, varchar, real, timestamp, jsonb } from 'drizzle-orm/pg-core';
import { drizzle } from 'drizzle-orm/node-postgres';

// Use namespaced schema
const pluginSchema = pgSchema('plugin_myplugin');

export const metricsTable = pluginSchema.table('metrics', {
  id: uuid('id').primaryKey().defaultRandom(),
  metric: varchar('metric', { length: 255 }).notNull(),
  value: real('value').notNull(),
  timestamp: timestamp('timestamp').defaultNow().notNull(),
  tags: jsonb('tags').default({}),
});

interface TimeSeriesPoint {
  time: Date;
  value: string | null; // avg() comes back as a string from the driver
}

export class MetricsRepository {
  constructor(private readonly db: ReturnType<typeof drizzle>) {}

  async getTimeSeries(metric: string, hours = 24): Promise<TimeSeriesPoint[]> {
    const since = new Date(Date.now() - hours * 60 * 60 * 1000);

    return await this.db
      .select({
        time: sql<Date>`DATE_TRUNC('hour', ${metricsTable.timestamp})`,
        value: avg(metricsTable.value),
      })
      .from(metricsTable)
      .where(and(eq(metricsTable.metric, metric), gte(metricsTable.timestamp, since)))
      .groupBy(sql`DATE_TRUNC('hour', ${metricsTable.timestamp})`)
      .orderBy(sql`DATE_TRUNC('hour', ${metricsTable.timestamp})`);
  }
}

Troubleshooting Common Issues

"Drizzle version mismatch" Error

# Solution: Match the monorepo version exactly
bun add drizzle-orm@$(grep '"drizzle-orm"' ../../packages/core/package.json | cut -d'"' -f4)

"Schema already exists" in PGLite

// PGLite may cache schemas - restart or use:
const schema = pgSchema('plugin_myplugin_v2'); // Version your schema name

"Destructive migration blocked" in Production

# For intentional schema changes:
ELIZA_ALLOW_DESTRUCTIVE_MIGRATIONS=true NODE_ENV=production bun run start

"Cannot find module 'drizzle-orm/pg-core'"

# Ensure drizzle-orm is in dependencies, not devDependencies
bun add drizzle-orm  # Not bun add -d

Migration Not Running

// Ensure schema is exported and registered:
export const plugin: Plugin = {
  name: '@your-org/plugin',
  schema: schema, // Must be defined!
};

Summary

To add custom schema to an elizaOS plugin with automatic migrations:
  1. Match Drizzle Version: Use the same version as the monorepo (bun add drizzle-orm@^0.44.2)
  2. Use Schema Namespacing: Always use pgSchema('plugin_yourname') for isolation
  3. Define Your Tables: Create tables with or without agentId for scoping
  4. Register Schema: Add schema to plugin definition for automatic migrations
  5. Build Components: Create repositories, actions, and providers
  6. Let Migrations Run: System handles everything automatically on startup
No manual migration files needed! The dynamic migration system detects changes and applies them safely, with full rollback support and production safeguards.

See Also