Oracle Data Pump: A Comprehensive Guide to expdp and impdp

Oracle Data Pump is a high-performance utility provided by Oracle for efficient data export and import operations. It supersedes the older exp and imp utilities with enhanced capabilities such as parallel processing, restartable jobs, and fine-grained object filtering, making data migration, backup, and recovery tasks more streamlined.

Create a Data Pump Directory

To utilize Oracle Data Pump, it is essential to create directories both at the operating system (OS) level and the database level. This is where your dump files will be stored during export and import processes.

1. Create a Directory at OS Level:

Execute the following command to create the OS-level directory where dump files will be written:

mkdir -p /opt/oracle/barode/backup
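
The directory must be writable by the OS user that runs the database software. The oracle user and oinstall group below are common defaults; adjust them to match your installation:

chown oracle:oinstall /opt/oracle/barode/backup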

2. Create a Directory at DB Level:

Once the OS directory is created, define it at the database level using SQL. Creating a directory object requires the CREATE ANY DIRECTORY privilege, so this is typically done as a DBA user:

SQL> CREATE DIRECTORY datapump_export AS '/opt/oracle/barode/backup';

3. Grant Permissions to the DB User:

Grant the necessary read and write permissions to the database user who will perform the export and import operations:

SQL> GRANT READ, WRITE ON DIRECTORY datapump_export TO db_user;
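
To confirm that the directory object exists and points to the intended path, query the DBA_DIRECTORIES dictionary view (directory names are stored in uppercase):

SQL> SELECT DIRECTORY_NAME, DIRECTORY_PATH FROM DBA_DIRECTORIES WHERE DIRECTORY_NAME = 'DATAPUMP_EXPORT';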

Full Database Export and Import

The "FULL" parameter is used for exporting and importing the entire database. This is useful for complete backups or migrations.

Export Full Database:

To export an entire database, use the following command:

expdp system/password@db_instance_name FULL=Y DIRECTORY=datapump_export DUMPFILE=full_db_backup.dmp LOGFILE=full_db_backup.log

Import Full Database:

For importing the full database from a dump file, use this command:

impdp system/password@db_instance_name FULL=Y DIRECTORY=datapump_export DUMPFILE=full_db_backup.dmp LOGFILE=full_db_restore.log

Ensure the user performing these operations has the required roles. The DATAPUMP_EXP_FULL_DATABASE role is necessary for export, and the DATAPUMP_IMP_FULL_DATABASE role is needed for import.
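
You can verify that these roles are in place by querying DBA_ROLE_PRIVS:

SQL> SELECT GRANTEE, GRANTED_ROLE FROM DBA_ROLE_PRIVS WHERE GRANTED_ROLE LIKE 'DATAPUMP%';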

Schema Export and Import

The "SCHEMAS" parameter is used when you want to export or import specific schemas. This is ideal for partial backups or migrations involving specific database schemas.

Export Schema:

To export one or more schemas, use the following command:

expdp system/password@db_instance_name SCHEMAS=db_user DIRECTORY=datapump_export DUMPFILE=db_user_bkp.dmp LOGFILE=db_user_backup.log

Import Schema:

To import a schema, use this command:

impdp system/password@db_instance_name SCHEMAS=db_user DIRECTORY=datapump_export DUMPFILE=db_user_bkp.dmp LOGFILE=db_user_restore.log
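
The SCHEMAS parameter accepts a comma-separated list, and the EXCLUDE parameter can skip object types you do not need. As a sketch with hypothetical schema names, the following exports two schemas while excluding optimizer statistics:

expdp system/password@db_instance_name SCHEMAS=hr,sales EXCLUDE=STATISTICS DIRECTORY=datapump_export DUMPFILE=multi_schema_bkp.dmp LOGFILE=multi_schema_backup.log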

Table Export and Import

Use the "TABLES" parameter to export or import specific tables. This can be useful when you need to back up or migrate individual tables.

Export Tables:

To export selected tables, use:

expdp db_user/password@db_instance_name TABLES=MY_TABLE1,MY_TABLE2 DIRECTORY=datapump_export DUMPFILE=my_db_tables.dmp LOGFILE=my_db_tables_backup.log

Import Tables:

To import the exported tables, use:

impdp db_user/password@db_instance_name TABLES=MY_TABLE1,MY_TABLE2 DIRECTORY=datapump_export DUMPFILE=my_db_tables.dmp LOGFILE=my_db_tables_restore.log

To import tables into a different schema or rename tables, use the "REMAP_SCHEMA" and "REMAP_TABLE" parameters respectively:

Remap Schema:

impdp db_user1/password@db_instance_name TABLES=MY_TABLE1,MY_TABLE2 DIRECTORY=datapump_export REMAP_SCHEMA=db_user:db_user1 DUMPFILE=my_db_tables.dmp LOGFILE=my_db_tables_restore.log

Remap Table:

impdp db_user/password@db_instance_name TABLES=MY_TABLE1,MY_TABLE2 DIRECTORY=datapump_export REMAP_TABLE=db_user.MY_TABLE1:NEW_TABLE1,db_user.MY_TABLE2:NEW_TABLE2 DUMPFILE=my_db_tables.dmp LOGFILE=my_db_tables_restore.log

Import Data Only:

To load data into tables that already exist without recreating them, use the CONTENT=DATA_ONLY parameter. This instructs Oracle Data Pump to import only the table rows and skip all object definitions, which is useful when the target tables are already present in the schema.

impdp db_user/password@db_instance_name TABLES=MY_TABLE1,MY_TABLE2 CONTENT=DATA_ONLY DIRECTORY=datapump_export DUMPFILE=my_db_tables.dmp LOGFILE=my_db_tables_restore.log
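
When the target tables already contain rows, the TABLE_EXISTS_ACTION parameter controls how they are handled; with CONTENT=DATA_ONLY the default is APPEND. For example, to empty the tables before loading the data:

impdp db_user/password@db_instance_name TABLES=MY_TABLE1,MY_TABLE2 CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=TRUNCATE DIRECTORY=datapump_export DUMPFILE=my_db_tables.dmp LOGFILE=my_db_tables_restore.log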

Remap Tablespace:

To place imported objects in a different tablespace, use the REMAP_TABLESPACE parameter to instruct Oracle Data Pump to assign a new tablespace. This option can be used during full, schema-level, or table-level imports.

impdp db_user/password@db_instance_name TABLES=MY_TABLE1,MY_TABLE2 REMAP_TABLESPACE=USER_TBS:USER_TBS1 DIRECTORY=datapump_export DUMPFILE=my_db_tables.dmp LOGFILE=my_db_tables_restore.log

Data Pump Performance Tuning

Data Pump performance can be improved with the PARALLEL parameter. Use it together with the %U wildcard in the DUMPFILE parameter so that multiple dump files can be created or read concurrently; the same wildcard is used during import to reference all of the files. For example:

Export with Parallel Processing:

expdp system/password@db_instance_name SCHEMAS=db_user DIRECTORY=datapump_export DUMPFILE=db_user_bkp_%U.dmp PARALLEL=4 LOGFILE=db_user_backup.log

Import with Parallel Processing:

impdp system/password@db_instance_name SCHEMAS=db_user DIRECTORY=datapump_export DUMPFILE=db_user_bkp_%U.dmp PARALLEL=4 LOGFILE=db_user_restore.log

Adjust the number of parallel workers based on your server's CPU resources for optimal performance.
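
The FILESIZE parameter pairs naturally with the %U wildcard by capping the size of each dump file, which keeps individual files manageable; the 5 GB limit below is only an example:

expdp system/password@db_instance_name SCHEMAS=db_user DIRECTORY=datapump_export DUMPFILE=db_user_bkp_%U.dmp FILESIZE=5G PARALLEL=4 LOGFILE=db_user_backup.log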

Data Pump Export & Import Progress

Monitor the progress of Data Pump jobs to ensure they are running as expected. Use the following SQL query to check the status:

SELECT
    SID,
    SERIAL#,
    USERNAME,
    CONTEXT,
    SOFAR,
    TOTALWORK,
    ROUND(SOFAR/TOTALWORK*100, 2) "% COMPLETE"
FROM V$SESSION_LONGOPS
WHERE TOTALWORK != 0 AND SOFAR <> TOTALWORK;
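
For a job-level view, you can also query DBA_DATAPUMP_JOBS, which lists each Data Pump job and its current state:

SQL> SELECT OWNER_NAME, JOB_NAME, OPERATION, JOB_MODE, STATE, DEGREE FROM DBA_DATAPUMP_JOBS;

You can also attach to a running job (for example, expdp system/password ATTACH=job_name) and type STATUS at the interactive prompt for live progress details.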

Troubleshooting Oracle Data Pump

While Oracle Data Pump is a robust tool, you might encounter issues during export or import operations. Here are some common problems, associated Oracle error codes, and solutions:

1. Directory Object Not Found:

If you receive errors related to directory objects, it usually means that the directory object specified in the Data Pump command does not exist or is incorrectly defined. Ensure that the directory is correctly created at both the OS and database levels. Verify that the directory path is correct and that the database user has the necessary permissions.

Error Code: ORA-39002: invalid operation

Description: This error often occurs if the directory object referenced by your Data Pump job does not exist, if the path is incorrect, or if the job cannot access the specified directory. It is commonly accompanied by ORA-39070 (unable to open the log file) and ORA-29283 (invalid file operation).

Solution: Check that the directory object is created correctly using:

SQL> CREATE OR REPLACE DIRECTORY datapump_export AS '/opt/oracle/barode/backup';

Ensure that the directory path is correct and verify permissions:

SQL> GRANT READ, WRITE ON DIRECTORY datapump_export TO db_user;

2. Insufficient Privileges:

Errors related to privileges occur if the user lacks necessary roles or permissions to perform the Data Pump operations. Ensure that the user has the appropriate roles such as DATAPUMP_EXP_FULL_DATABASE and DATAPUMP_IMP_FULL_DATABASE for full database operations.

Error Code: ORA-31631: privileges are required

Description: This error appears when the user does not have the privileges required for the requested operation, for example attempting a full database export without the DATAPUMP_EXP_FULL_DATABASE role.

Solution: Grant the required roles to the user. For full database operations, execute:

SQL> GRANT DATAPUMP_EXP_FULL_DATABASE TO db_user;
SQL> GRANT DATAPUMP_IMP_FULL_DATABASE TO db_user;

3. Network Issues:

For remote database connections, network issues can occur if there are problems with connectivity or network configuration. Ensure that network configurations are correctly set and that there are no connectivity issues between the source and target databases.

Error Code: ORA-12154: TNS:could not resolve the connect identifier specified

Description: This error indicates that the Oracle client cannot resolve the service name specified in the connection string.

Solution: Verify that the tnsnames.ora file is correctly configured and that the service name is correct. Test the connection using:

tnsping your_service_name

Ensure that the network configuration allows communication between the source and target databases. Check firewall settings, network routes, and other relevant configurations.
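
If name resolution keeps failing, you can bypass tnsnames.ora entirely by testing with EZConnect syntax (the host, port, and service name below are placeholders):

sqlplus db_user/password@//db_host:1521/your_service_name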

Best Practices for Oracle Data Pump

Following best practices can help ensure efficient and successful data migration and backup operations using Oracle Data Pump:

1. Use the Latest Version:

Always use the latest version of Oracle Data Pump to benefit from performance improvements and new features.

2. Monitor Resource Usage:

Keep an eye on system resources during large data operations to prevent performance degradation. Adjust parallelism and resource allocation as needed.

3. Perform Regular Backups:

Schedule regular backups using Data Pump to ensure you have recent copies of your data. Consider automating backup processes to minimize manual intervention.
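
One way to automate this is to keep the export options in a parameter file and invoke expdp from cron or DBMS_SCHEDULER. The file name and values below are illustrative assumptions, not part of this guide:

# db_user_backup.par -- hypothetical parameter file
SCHEMAS=db_user
DIRECTORY=datapump_export
DUMPFILE=db_user_bkp_%U.dmp
LOGFILE=db_user_backup.log
PARALLEL=4
REUSE_DUMPFILES=YES

expdp system/password@db_instance_name PARFILE=db_user_backup.par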

4. Validate Data Post-Import:

After importing data, perform validation checks to ensure the integrity and accuracy of the migrated data. Run necessary queries to verify that all data has been imported correctly.
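
A simple first-pass check is to compare row and object counts between the source and target. For example, using the schema and tables from the earlier examples:

SQL> SELECT COUNT(*) FROM db_user.MY_TABLE1;
SQL> SELECT OBJECT_TYPE, COUNT(*) FROM DBA_OBJECTS WHERE OWNER = 'DB_USER' GROUP BY OBJECT_TYPE;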

