We discussed external tables using SQL*Loader, but let’s take a little time to look at Data Pump as well. Data Pump is a utility for unloading data from and loading data into the database. It can also give you a quick way to back up, replicate, and securely copy your data and metadata. You can use Data Pump in a variety of ways:
• Perform point-in-time logical backups of the entire database or subsets of data
• Replicate entire databases or subsets of data for testing or development
• Quickly generate the DDL required to re-create objects (an example follows this list)
• Upgrade a database by exporting from the old version and importing it into the new version
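For instance, the DDL-generation use case can be handled with the import utility’s SQLFILE parameter, which writes the DDL to a script instead of executing the import. A minimal sketch follows; the directory path, directory object, schema, and file names (dpump_dir, hr, hr.dmp, recreate_hr.sql) are placeholders, not fixed names:

    -- Create and secure a directory object for the dump files (path is illustrative)
    SQL> CREATE DIRECTORY dpump_dir AS '/u01/app/dpump';
    SQL> GRANT READ, WRITE ON DIRECTORY dpump_dir TO hr;

    $ expdp hr SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp
    $ impdp hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp SQLFILE=recreate_hr.sql

The SQLFILE run does not change the database at all; it simply extracts the DDL from the dump file into recreate_hr.sql.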
When you are considering ways to load and unload data and choosing among the different utilities and external tables, this list of Data Pump functionality can help you decide which tool works best for the job:
• Performance with large data sets, allowing efficient export and import of data
• Interactive command-line utility, which lets you disconnect from and later reattach to active Data Pump jobs and monitor job progress
• Ability to export large amounts of data from a remote database and import it directly into a local database without creating a dump file (see the sketch after this list)
• Ability to make on-the-fly changes to schemas, tablespaces, data files, and storage settings from export to import
• Sophisticated filtering of objects and data
• Ability to perform transportable tablespace exports
• Directory objects and data directories whose security is controlled via the database
• Advanced features, such as compression and encryption
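As a sketch of the network-mode and remapping features above, a single impdp command can pull a schema over a database link and rename it on the way in, with no intermediate dump file. The database link, schema names, and filter value here are assumptions for illustration:

    $ impdp system NETWORK_LINK=prod_link SCHEMAS=hr REMAP_SCHEMA=hr:hr_test EXCLUDE=STATISTICS

Here NETWORK_LINK names an existing database link to the remote database, REMAP_SCHEMA rewrites the schema during the load, and EXCLUDE filters out objects (optimizer statistics in this case).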
There are additional ways to move data between databases beyond those discussed in this chapter, including pluggable databases, cloning, backup and restore, and other functionality.
Data Pump Architecture
Data Pump consists of the following components:
• expdp (Data Pump export utility)
• impdp (Data Pump import utility)
• DBMS_DATAPUMP PL/SQL package (Data Pump application programming interface [API])
• DBMS_METADATA PL/SQL package (Data Pump Metadata API)
The expdp and impdp utilities use the DBMS_DATAPUMP and DBMS_METADATA built-in PL/SQL packages when exporting and importing data and metadata. The DBMS_DATAPUMP and DBMS_METADATA packages can also be called directly, outside of Data Pump jobs, which is useful for monitoring jobs and retrieving DDL statements.
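For example, DBMS_METADATA can be called on its own to retrieve the DDL for an object; the schema and table names below are placeholders:

    SQL> SET LONG 100000
    SQL> SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMPLOYEES', 'HR') FROM dual;

The SET LONG command simply raises SQL*Plus’s display limit so the full CREATE TABLE statement is shown.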
When you start a Data Pump export or import job, a master OS process is initiated, and a status table is created in the database for the duration of the job. A Data Pump job runs in one of several modes, each of which maps to a command-line parameter (sketched after this list):
• FULL
• SCHEMA
• TABLE
• TABLESPACE
• TRANSPORTABLE
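A quick sketch of how the modes map to expdp parameters (the directory, dump file, and object names are placeholders):

    $ expdp system FULL=YES DIRECTORY=dpump_dir DUMPFILE=full.dmp              # FULL mode
    $ expdp system SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp              # SCHEMA mode
    $ expdp system TABLES=hr.employees DIRECTORY=dpump_dir DUMPFILE=emp.dmp    # TABLE mode
    $ expdp system TABLESPACES=users DIRECTORY=dpump_dir DUMPFILE=users.dmp    # TABLESPACE mode
    $ expdp system FULL=YES TRANSPORTABLE=ALWAYS DIRECTORY=dpump_dir DUMPFILE=tts.dmp   # TRANSPORTABLE mode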
For example, if you are exporting a schema, a status table is created in your account with the name SYS_EXPORT_SCHEMA_NN, where NN is a number that makes the table name unique in the user’s schema. This status table contains information such as the objects exported/imported, start time, elapsed time, rows processed, and error count. The status table has more than 80 columns.
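From another session, you can see running jobs (and the names of their status tables) through the DBA_DATAPUMP_JOBS view, and reattach to one by name; the job name below is illustrative:

    SQL> SELECT owner_name, job_name, operation, job_mode, state FROM dba_datapump_jobs;

    $ expdp system ATTACH=SYS_EXPORT_SCHEMA_01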
A Data Pump export creates an export (dump) file and a log file. The export file contains the objects being exported; the log file records the job’s activities. Export writes data out of the database, and import brings data into the database.
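Putting that together, a minimal schema export and the matching import might look like the following, with each run writing its own log file (all names are placeholders):

    $ expdp hr SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp LOGFILE=hr_exp.log
    $ impdp hr SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp LOGFILE=hr_imp.log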