Oracle → PostgreSQL Migration Guide

A complete step-by-step guide covering schema conversion, bulk data migration, PL/SQL transpilation, and zero-downtime cutover via CDC.

Prerequisites

  • Oracle 11g Release 2 or higher (11.2.0.4+)
  • Oracle LogMiner option enabled on the database
  • DBA or SELECT ANY TABLE privileges on the source schema
  • Python packages: pip install oracledb psycopg2-binary loguru
  • PostgreSQL 13+ target database with a dedicated migration user (CREATEDB, CREATEROLE)
  • Network connectivity between DBMigrateAIPro host and both databases
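
Both drivers accept plain string connection specs, so the two endpoints can be described up front before any migration step runs. A minimal sketch, assuming EZConnect for Oracle and a keyword DSN for psycopg2; the hosts, ports, and database names below are placeholder assumptions, not values from this guide:

```python
def oracle_ezconnect(host: str, port: int, service: str) -> str:
    """EZConnect-style connect string accepted by python-oracledb."""
    return f"{host}:{port}/{service}"

def postgres_dsn(host: str, port: int, dbname: str, user: str) -> str:
    """Keyword DSN string accepted by psycopg2.connect()."""
    return f"host={host} port={port} dbname={dbname} user={user}"

# Placeholder endpoints; substitute your own hosts and names.
src = oracle_ezconnect("ora-prod", 1521, "ORCLPDB1")
tgt = postgres_dsn("pg-target", 5432, "appdb", "migrator")
```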

Step 1: Enable Supplemental Logging

Supplemental logging is required for Oracle LogMiner to capture before/after column values for CDC. Enable it at the database level, then per-table for all tables you want to capture:

-- Enable at database level (run as SYSDBA)
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;

-- Enable per-table (run for each table in the migration)
ALTER TABLE schema_name.table_name ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;

Verify with: SELECT SUPPLEMENTAL_LOG_DATA_MIN FROM V$DATABASE; — should return YES.
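
Because the per-table statement must be run for every table in the migration, generating the whole batch from a table list avoids omissions. A minimal sketch; the schema and table names are illustrative:

```python
def supplemental_log_ddl(tables):
    """One ALTER TABLE per table to enable all-column supplemental logging."""
    return [
        f"ALTER TABLE {t} ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;"
        for t in tables
    ]

# Illustrative table list; replace with the tables in your migration scope.
statements = supplemental_log_ddl(["HR.EMPLOYEES", "HR.DEPARTMENTS"])
```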

Step 2: Capture Start SCN

Before migrating any data, record the current System Change Number (SCN). This is your CDC start point — changes made to Oracle after this SCN will be replayed into PostgreSQL during cutover.

SELECT CURRENT_SCN FROM V$DATABASE;

Save this number; it becomes start_scn in your CDC settings. Archive logs must be retained from this SCN until CDC completes.
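
Since the SCN must survive until cutover, it is worth persisting it with a capture timestamp rather than keeping it in a terminal scrollback. A minimal sketch; the record shape and the SCN value are illustrative, not part of DBMigrateAIPro:

```python
import json
import time

def make_scn_record(scn: int) -> dict:
    """Pair the captured SCN with a capture timestamp so the CDC start point is auditable."""
    return {"start_scn": scn, "captured_at": int(time.time())}

# Serialize for safekeeping; the SCN value here is illustrative.
record_json = json.dumps(make_scn_record(48213377421))
```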

Step 3: Schema Assessment

Click Assess Schema in the Actions tab. The report in reports/assessment_report.html covers:

  • XMLTYPE columns: plan JSONB conversion; export as CLOB first.
  • CONNECT BY queries: replace with WITH RECURSIVE CTEs in views/procedures.
  • Circular FK dependencies: enable defer_fks_during_load in settings.
  • Object counts: estimate migration time (~50k rows/sec per worker at defaults).
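
The throughput figure above gives a rough wall-clock estimate: total rows divided by (50,000 rows/sec × worker count), assuming throughput scales linearly with workers. A minimal sketch of the arithmetic:

```python
def estimate_seconds(total_rows: int, workers: int = 4,
                     rows_per_sec_per_worker: int = 50_000) -> float:
    """Rough wall-clock estimate, assuming linear scaling across workers."""
    return total_rows / (rows_per_sec_per_worker * workers)

# 100 million rows with the default 4 workers: 100_000_000 / (50_000 * 4) = 500 s
bulk_eta = estimate_seconds(100_000_000)
```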

Step 4: DDL Conversion

DBMigrateAIPro reads Oracle DDL via DBMS_METADATA and converts each object to PostgreSQL syntax. Tables, sequences, indexes, and foreign keys are all handled. Converted DDL scripts are saved to ddl/.

Data Type Mapping

Oracle Type                       PostgreSQL Type
NUMBER(p, 0) where p ≤ 9          INTEGER
NUMBER(p, 0) where p ≤ 18         BIGINT
NUMBER(p, s)                      NUMERIC(p, s)
NUMBER                            NUMERIC
VARCHAR2(n)                       VARCHAR(n)
CHAR(n)                           CHAR(n)
CLOB                              TEXT
BLOB                              BYTEA
DATE                              TIMESTAMP
TIMESTAMP                         TIMESTAMP
TIMESTAMP WITH TIME ZONE          TIMESTAMPTZ
XMLTYPE                           JSONB (or TEXT)
RAW(n)                            BYTEA
FLOAT                             DOUBLE PRECISION
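
The NUMBER rows of the table branch on precision and scale. A minimal sketch of that branch; the function name is illustrative and is not DBMigrateAIPro's internal API:

```python
def map_number(precision=None, scale=None) -> str:
    """Map Oracle NUMBER(p, s) to a PostgreSQL type following the table above."""
    if precision is None:
        return "NUMERIC"                   # unconstrained NUMBER
    if not scale:                          # scale 0 (or omitted): integral types
        if precision <= 9:
            return "INTEGER"
        if precision <= 18:
            return "BIGINT"
        return f"NUMERIC({precision}, 0)"  # too wide for BIGINT
    return f"NUMERIC({precision}, {scale})"
```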

Step 5: Data Migration

DBMigrateAIPro uses parallel workers (default: 4) to stream data from Oracle and load it into PostgreSQL using the native COPY protocol for maximum throughput.

  • FK deferral: FKs are dropped before load and re-created afterward to avoid constraint violations.
  • Trigger disabling: PostgreSQL triggers on target tables are disabled during the bulk load.
  • Sequence sync: after load, each sequence is reset to MAX(id) + 1 of the loaded data.
  • Progress: live rows/sec and ETA per worker, visible in the Perf tab.
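
The sequence-sync step reduces to one setval() statement per table. A sketch of generating it, assuming the identifiers shown are placeholders; with is_called set to false, the next nextval() returns exactly the value passed:

```python
def sequence_sync_sql(table: str, id_col: str, seq: str) -> str:
    """setval() so the next nextval() yields MAX(id) + 1, or 1 for an empty table."""
    return (
        f"SELECT setval('{seq}', "
        f"COALESCE((SELECT MAX({id_col}) FROM {table}), 0) + 1, false);"
    )

# Placeholder identifiers; substitute your table, key column, and sequence names.
stmt = sequence_sync_sql("orders", "id", "orders_id_seq")
```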

Step 6: PL/SQL Conversion

Stored procedures, functions, and package bodies are transpiled to PL/pgSQL. Key transformations applied automatically:

Oracle PL/SQL                     PostgreSQL PL/pgSQL
Package body                      Functions prefixed schema.pkg_proc_name()
ROWNUM                            LIMIT n (in subquery context)
NVL(a, b)                         COALESCE(a, b)
SYSDATE                           NOW()
CONNECT BY                        WITH RECURSIVE cte AS (...)
BULK COLLECT INTO                 SELECT INTO array / RETURN QUERY

Converted PL/SQL scripts are saved to sql/ with inline comments marking each transformation for review.
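
Two of the expression-level rewrites in the table can be demonstrated with regular expressions. Real transpilers parse the source rather than pattern-match, so this is only a sketch that works on simple expressions:

```python
import re

# NVL -> COALESCE and SYSDATE -> NOW(), per the mapping table above.
RULES = [
    (re.compile(r"\bNVL\s*\(", re.IGNORECASE), "COALESCE("),
    (re.compile(r"\bSYSDATE\b", re.IGNORECASE), "NOW()"),
]

def transpile_expr(expr: str) -> str:
    """Apply each rewrite rule in order to a single SQL expression."""
    for pattern, replacement in RULES:
        expr = pattern.sub(replacement, expr)
    return expr
```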

Step 7: Validation

Validation runs automatically after data load. Open reports/validation_report.html to review:

  • Row count comparison — source vs target for every table
  • Optional MD5 checksum per row — enable run_checksum_check: true
  • Sequence current values — compared and synced
  • Index counts — verified in target schema
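
The row-count comparison amounts to diffing two table-to-count maps, one per database. A minimal sketch of that check; the counts used are illustrative:

```python
def compare_counts(source: dict, target: dict) -> list:
    """Return (table, source_count, target_count) for each mismatch or missing table."""
    mismatches = []
    for table, src_count in sorted(source.items()):
        tgt_count = target.get(table)  # None if the table is absent on target
        if tgt_count != src_count:
            mismatches.append((table, src_count, tgt_count))
    return mismatches
```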

Step 8: Start CDC for Cutover

Using the SCN captured in Step 2, start the LogMiner stream. DBMigrateAIPro replays all INSERT / UPDATE / DELETE events that occurred on Oracle while bulk migration was running.

# In DBMigrateAIPro CDC tab — set:
source_type: oracle
start_scn:   <SCN from Step 2>
tables:      [SCHEMA.TABLE1, SCHEMA.TABLE2]

# Click "Start CDC"

Monitor lag (seconds behind source) in the CDC panel; wait until lag is below 5 seconds before starting cutover.
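
A safer gate than a single reading is requiring several consecutive lag samples under the threshold. A sketch: the 5-second threshold comes from the guidance above, while the 3-sample window is an assumption, not a DBMigrateAIPro setting:

```python
def cutover_ready(lag_samples, threshold_s: float = 5.0, window: int = 3) -> bool:
    """True when the most recent `window` lag readings are all below the threshold."""
    if len(lag_samples) < window:
        return False  # not enough evidence yet
    return all(lag < threshold_s for lag in lag_samples[-window:])
```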

Step 9: Cutover

  1. Stop application writes to Oracle (maintenance window or connection drain)
  2. Wait for CDC lag to reach 0 (all changes replayed into PostgreSQL)
  3. Stop CDC in DBMigrateAIPro
  4. Re-enable FK constraints and triggers on the PostgreSQL target
  5. Update the application connection string to point to PostgreSQL
  6. Run smoke tests against PostgreSQL to validate critical queries
  7. Re-enable application traffic
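
The trigger half of step 4 is one statement per target table. A minimal sketch, assuming the table names shown are placeholders; FK re-creation replays the DDL saved during FK deferral and is omitted here:

```python
def enable_triggers_sql(tables):
    """Re-enable all triggers that were disabled for the bulk load."""
    return [f"ALTER TABLE {t} ENABLE TRIGGER ALL;" for t in tables]

# Placeholder table names; generate from your migrated table list.
stmts = enable_triggers_sql(["public.orders", "public.order_items"])
```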

Troubleshooting

  • ORA-01291 (supplemental logging not enabled): run ALTER DATABASE ADD SUPPLEMENTAL LOG DATA; as SYSDBA before starting CDC.
  • ORA-06550 (LogMiner option not available): requires the Oracle LogMiner option; handled automatically in DBMigrateAIPro v2.0+.
  • DataError: invalid input (type cast errors on NUMBER columns): check the type mapping table and override with an explicit cast in config/column_overrides.json.
  • Sequence gap after migration (PostgreSQL sequence starts at 1): DBMigrateAIPro auto-syncs sequences; if a gap appears, run SELECT setval('seq', (SELECT MAX(id) FROM table));
  • Timezone drift (DATE columns shifted by timezone offset): Oracle DATE has no timezone; set timezone = 'UTC' in PostgreSQL and keep the application layer consistent.