List Contents of Dump Files

Data Pump provides a robust method of creating a file that contains all the SQL that an import job would execute. Internally, Data Pump uses the DBMS_METADATA package to generate the DDL that you can use to re-create the objects in a Data Pump dump file.

Use the SQLFILE option of Data Pump import to list the contents of a Data Pump export file.

This example creates a file named expfull.sql containing the SQL statements that the import process would run (the file is placed in the directory defined by the DPUMP_DIR2 directory object):

$ impdp hr/Pa$$w0rd123! directory=dpump_dir1 dumpfile=expfull.dmp SQLFILE=dpump_dir2:expfull.sql

If you do not specify a separate directory such as dpump_dir2 in the previous example, then the SQL file is written to the location specified in the DIRECTORY option.
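
For example, this command is identical to the previous one except that it omits the directory prefix on the SQLFILE parameter, so expfull.sql is written to the directory mapped by the dpump_dir1 directory object:

$ impdp hr/Pa$$w0rd123! directory=dpump_dir1 dumpfile=expfull.dmp SQLFILE=expfull.sql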

When you use the SQLFILE option with an import, the impdp process does not import any data; it only creates a file that contains the SQL commands that would be run by the import process. It is sometimes handy to generate a SQL file for the following reasons:

•     Preview and verify the SQL statements before running the import.

•     Run the SQL manually to pre-create database objects.

•     Capture the SQL that would be required to re-create database objects (users, tables, indexes, and so on).

In regard to the last bulleted item, sometimes what is checked into the source code control repository does not match what has really been applied to the production database. This procedure can be handy for troubleshooting or documenting the state of the database at a point in time.
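
As a sketch of that last use case, you can combine SQLFILE with the INCLUDE parameter to limit the generated DDL to specific object types. This example (the output file name here is arbitrary) captures only the table and index definitions contained in the dump file:

$ impdp hr/Pa$$w0rd123! directory=dpump_dir1 dumpfile=expfull.dmp SQLFILE=table_index_ddl.sql include=TABLE include=INDEX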

Monitoring Data Pump Jobs

When you have long-running Data Pump jobs, you should occasionally check the status of the job to ensure it has not failed, become suspended, and so on. There are several ways to monitor the status of Data Pump jobs:

•     Screen output

•     Data Pump log file

•     Querying data dictionary views

•     Database alert log

•     Querying the status table

•     Interactive command mode status

•     Using the process status (ps) OS utility

•     Oracle Enterprise Manager

The most obvious way to monitor a job is to view the status that Data Pump displays on the screen as the job runs. If you have disconnected from the interactive command mode, the status is no longer displayed on your screen, and you must use one of the other techniques to monitor the job.
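
For example, to check job status from another session, you can query the DBA_DATAPUMP_JOBS data dictionary view and, if necessary, reattach to a running job with the ATTACH parameter (the job name shown here is a typical system-generated name; substitute the JOB_NAME reported by the query):

SQL> SELECT owner_name, job_name, operation, state FROM dba_datapump_jobs;

$ impdp hr/Pa$$w0rd123! attach=SYS_IMPORT_FULL_01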

This chapter has covered different ways of loading and unloading data: external tables, SQL*Loader, and Data Pump. These tools help you manage not only the data but also the metadata of your database objects.

