
How To Use Datapump Between Different Database Releases

Using Oracle Data Pump between different database releases is a common practice for moving data, especially when upgrading from an older version to a newer one. Data Pump lets you export from one database release and import into another, whether the target database is newer, older, or the same version. Here’s a guide on how to use Data Pump (expdp/impdp) between different Oracle database releases.

General Process Overview:

  1. Export the data from the source database using expdp (Data Pump Export).
  2. Import the data into the target database using impdp (Data Pump Import).
  3. Handle any compatibility issues, especially when exporting from a newer database version to an older one.

Data Pump Basics

Data Pump utilities:

  • expdp: Data Pump Export utility used to export data from a database.
  • impdp: Data Pump Import utility used to import data into a database.

Both of these utilities work with dump files to transport data between Oracle databases.

Steps to Use Data Pump Between Different Database Releases

Step 1: Prepare the Directory on Both Databases

On both the source (older version) and target (newer version) databases, you need to have a directory object to store and read the dump files.

  1. Create a Directory on the Source Database:


    CREATE OR REPLACE DIRECTORY dpump_dir AS '/path_to_dumpfile';

    • Replace /path_to_dumpfile with the actual file-system path where the export file will be stored.
    • Ensure the Oracle OS user has read/write access to this path.
  2. Create a Directory on the Target Database:

    On the target database (Oracle 19c, for example), create a similar directory object.


    CREATE OR REPLACE DIRECTORY dpump_dir AS '/path_to_dumpfile';
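Besides the OS-level permissions, the database user running the export or import needs privileges on the directory object. A minimal sketch (run as a DBA; the user name scott is a placeholder):

```sql
-- Grant the database user access to the directory object
-- ("scott" is a placeholder user name)
GRANT READ, WRITE ON DIRECTORY dpump_dir TO scott;

-- Only needed for full-database (not schema-level) operations:
GRANT DATAPUMP_EXP_FULL_DATABASE TO scott;
GRANT DATAPUMP_IMP_FULL_DATABASE TO scott;
```

For exporting or importing your own schema, the directory grants plus CREATE SESSION are usually sufficient; the DATAPUMP_* roles are only required for full-database mode.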

Step 2: Perform the Export Using expdp

On the source database (let’s assume Oracle 12c), use the expdp command to export the data.

Example command for exporting a schema or table:


expdp username/password@oracle12c \
  schemas=your_schema \
  directory=dpump_dir \
  dumpfile=your_schema_export.dmp \
  logfile=export_log.log \
  version=12.2    # optional: only needed if the target is older than the source

Explanation:

  • username/password@oracle12c: The credentials for the Oracle 12c source database.
  • schemas=your_schema: Specifies which schema to export. You can also use tables= if you want to export specific tables.
  • directory=dpump_dir: The directory object where the dump file will be stored.
  • dumpfile=your_schema_export.dmp: The dump file that will be created for export.
  • logfile=export_log.log: A log file for tracking the export process.
  • version=12.2: Sets the compatibility version of the dump file, not of the export utility itself. Use this when exporting from a higher-version database for import into a lower-version database.

Note: If the target database is older than the source, you must specify the version parameter. For example, if you're exporting from Oracle 19c to Oracle 12c, use version=12.2 or lower (depending on the exact version of the target).

Step 3: Transfer the Dump File to the Target System

Once the export is complete, you’ll have a .dmp file (dump file) and a .log file (log file).

  • Transfer the Dump File: You can use FTP, SCP, SFTP, or other file transfer protocols to transfer the dump file from the source system (Oracle 12c) to the target system (Oracle 19c).

    Example:


    scp your_schema_export.dmp user@target_system:/path_to_dumpfile
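After transferring, it’s worth verifying the file arrived intact before starting a long import. One simple approach is to compare checksums on both hosts (shown here against a placeholder file; run it against your real .dmp on the source and target and compare the hashes):

```shell
# Create a placeholder standing in for the real dump file
echo "demo" > /tmp/your_schema_export.dmp

# Compute a checksum; run the same command on source and target
# after the transfer and confirm the hashes match
sha256sum /tmp/your_schema_export.dmp
```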

Step 4: Import the Data Using impdp

On the target database (Oracle 19c), you can import the data using the impdp command.

Example command:


impdp username/password@oracle19c \
  schemas=your_schema \
  directory=dpump_dir \
  dumpfile=your_schema_export.dmp \
  logfile=import_log.log \
  remap_schema=old_schema:new_schema    # optional

Explanation:

  • username/password@oracle19c: The credentials for the Oracle 19c target database.
  • schemas=your_schema: The schema you want to import.
  • directory=dpump_dir: The directory object where the dump file is located.
  • dumpfile=your_schema_export.dmp: The dump file created during the export process.
  • logfile=import_log.log: A log file for tracking the import process.
  • remap_schema=old_schema:new_schema: Optional. Use this if you want to import the data into a different schema than the original.

Step 5: Verify the Import

After the import is complete, verify that the schema or tables have been successfully imported into the Oracle 19c database by querying the data.


SELECT * FROM your_schema.your_table;
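Beyond spot-checking rows, you can compare object counts per type against the source database and look for objects that failed to compile after the import (YOUR_SCHEMA is a placeholder; schema names in the dictionary are uppercase):

```sql
-- Compare object counts per type with the same query on the source
SELECT object_type, COUNT(*)
FROM   dba_objects
WHERE  owner = 'YOUR_SCHEMA'
GROUP  BY object_type;

-- List objects left INVALID after the import
SELECT object_name, object_type
FROM   dba_objects
WHERE  owner = 'YOUR_SCHEMA'
AND    status = 'INVALID';
```

Invalid PL/SQL objects can often be recompiled with UTL_RECOMP or by recompiling them individually.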

Common Scenarios and Considerations

1. Moving Data from an Older Database to a Newer One (e.g., 12c to 19c)

  • Compatibility: Moving data from an older version like 12c to a newer version (e.g., 19c) is straightforward. Data Pump automatically handles the export/import, and you don’t need to specify the version parameter unless you have specific backward compatibility requirements.

  • Steps:

    • Perform a standard export using expdp in the 12c database.
    • Transfer the dump file to the 19c system.
    • Import the dump file using impdp in the 19c system.

2. Moving Data from a Newer Database to an Older One (e.g., 19c to 12c)

  • Version Parameter: When exporting data from a newer database (19c) to an older one (12c or earlier), you must use the version parameter in the expdp command. This tells Oracle to export the data in a format compatible with the older version.

    Example:


    expdp username/password@oracle19c \
      schemas=your_schema \
      directory=dpump_dir \
      dumpfile=your_schema_export.dmp \
      logfile=export_log.log \
      version=12.2

    In this example, the export will be compatible with the 12c (12.2) database.

  • Compatibility: Oracle ensures backward compatibility when using the version parameter, but it’s essential to test carefully when moving between major database versions.
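When choosing a value for the version parameter, a reasonable guide is the target database's COMPATIBLE initialization parameter, which you can check before exporting:

```sql
-- On the target database: the export's VERSION value
-- should not exceed this setting
SELECT value
FROM   v$parameter
WHERE  name = 'compatible';
```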

3. Cross-Platform Data Pump Export/Import

Data Pump also supports cross-platform migration (e.g., Linux to Windows). Conventional export/import dump files are platform-independent, so the steps are largely the same; just confirm that the target database’s character set can represent all of the source data.

Endian format only becomes a concern with the transportable tablespaces method, where datafiles are copied directly between platforms. If the platforms differ in endianness, convert the datafiles with RMAN CONVERT before plugging them in.


expdp username/password@oracle12c \
  transport_tablespaces=tbs_name \
  directory=dpump_dir \
  dumpfile=tbs_export.dmp \
  logfile=export_log.log

Note that the tablespace must be placed in READ ONLY mode before running a transportable export.

Then, use impdp on the target database.
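On the target side, the copied datafiles are plugged in with the transport_datafiles parameter. A sketch (the datafile path and names below are placeholders for your environment):

```
impdp username/password@oracle19c \
  directory=dpump_dir \
  dumpfile=tbs_export.dmp \
  logfile=import_tts.log \
  transport_datafiles='/u01/oradata/tbs_name01.dbf'
```

After the import completes, set the tablespace back to READ WRITE on the target if it should be writable.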

Data Pump Best Practices for Different Releases

  1. Use the Correct Version: Always use the version parameter when exporting to an older database.

  2. Use Data Pump Directories: Ensure that you have valid directory objects in both the source and target databases.

  3. Monitor Logs: Always check the log files (logfile) for warnings or errors.

  4. Network-Based Data Pump (optional): If you don’t want to deal with dump file transfers, you can use network mode to directly export/import between databases without generating intermediate files.

    Example:


    impdp username/password@oracle19c \
      network_link=link_to_12c \
      schemas=your_schema \
      logfile=impdp_network.log
  5. Testing: Perform tests with smaller schemas or tables before migrating larger datasets.
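The network_link value used in network mode refers to a database link created in the target database that points at the source. A minimal sketch (link name, credentials, and TNS alias are placeholders):

```sql
-- Run in the target (19c) database; credentials and
-- the TNS alias 'oracle12c' are placeholders
CREATE DATABASE LINK link_to_12c
  CONNECT TO source_user IDENTIFIED BY source_password
  USING 'oracle12c';
```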

Conclusion

Oracle Data Pump is a flexible and robust tool for migrating data between different Oracle Database releases. Whether you're upgrading from Oracle 12c to Oracle 19c or performing a cross-platform migration, Data Pump simplifies the process with its expdp and impdp utilities. Always ensure you’re using the correct version compatibility, monitor log files, and test migrations in a controlled environment before moving to production.






Please like and subscribe to my YouTube channel: https://www.youtube.com/@foalabs If you like this post, please follow, share, and comment.
