The expdp utility creates a file named exp.dmp in the /oradump directory, containing the information required to recreate the INV table and populate it with data as it was at the time the export was taken. Additionally, a log file named exp.log is created in the /oradump directory, containing logging information associated with this export job.
If you don't specify a dump file name, Data Pump creates a file named expdat.dmp. If a file named expdat.dmp already exists in the directory, then Data Pump throws an error. If you don't specify a log file name, then Data Pump creates one named export.log. If a log file named export.log already exists, then Data Pump overwrites it.
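For example, an export that relies on these defaults looks something like this (a sketch using the mv_maint user, dp_dir directory object, and INV table from the earlier examples):

$ expdp mv_maint/foo directory=dp_dir tables=inv

This creates expdat.dmp and export.log in the OS directory mapped to dp_dir. Running the same command a second time fails on the existing expdat.dmp unless you first remove that file or specify a different dump file name.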
■ Tip  Although it's possible to execute Data Pump as the SYS user, I don't recommend it, for a couple of reasons. First, SYS is required to connect to the database with the AS SYSDBA clause. This requires a Data Pump parameter file with the USERID parameter and quotes around the associated connect string, which is unwieldy. Second, most tables owned by SYS cannot be exported (there are a few exceptions, such as AUD$). If you attempt to export a table owned by SYS, Data Pump throws an ORA-39166 error and indicates that the table doesn't exist, which is confusing.
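The unwieldy syntax that the tip refers to looks roughly like the following parameter file (a sketch; the password and file names here are placeholders, not values from this chapter):

userid='sys/foo as sysdba'
directory=dp_dir
dumpfile=sys_exp.dmp

$ expdp parfile=sys.par

The quotes around the connect string are required because it contains spaces, which is exactly the sort of awkwardness that running Data Pump as a regular schema owner avoids.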
Importing a Table
One of the key reasons to export data is so that you can recreate database objects. You may want to do this as part of
a backup strategy or to replicate data to a different database. Data Pump import uses an export dump file as its input
and recreates database objects contained in the export file. The procedure for importing is similar to exporting:
1. Create a database directory object that points to an OS directory that you want to read/write Data Pump files from.
2. Grant read and write privileges on the directory object to the database user running the export or import.
3. From the OS prompt, run the impdp command.
Steps 1 and 2 were covered in the prior section, “Taking an Export,” and therefore will not be repeated here. Before running the import job, drop the INV table that was created previously.
SQL> drop table inv purge;
Next, recreate the INV table from the export taken:
$ impdp mv_maint/foo directory=dp_dir dumpfile=exp.dmp logfile=imp.log
You should now have the INV table recreated and populated with data as it was at the time of the export. Now is a good time to inspect Figures 8-1 and 8-2 again. Make sure you understand which files were created by expdp and which files were used by impdp.
Using a Parameter File
Instead of typing commands on the command line, in many situations it's better to store the commands in a file and
then reference the file when executing Data Pump export or import. Using parameter files makes tasks more repeatable
and less prone to error. You can place the commands in a file once and then reference that file multiple times.
Additionally, some Data Pump commands (such as FLASHBACK_TIME) require the use of quotation marks; in these situations, it's sometimes hard to predict how the OS will interpret them. Whenever a command requires quotation marks, it's highly preferable to use a parameter file.
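For instance, the export taken earlier could be driven from a parameter file like this (a sketch; the FLASHBACK_TIME timestamp is a placeholder shown only to illustrate the quoting that the parameter file keeps away from the OS shell):

userid=mv_maint/foo
directory=dp_dir
dumpfile=exp.dmp
logfile=exp.log
tables=inv
flashback_time="to_timestamp('24-aug-2013 14:00:00','dd-mon-yyyy hh24:mi:ss')"

$ expdp parfile=exp.par

Because the quotation marks live inside the parameter file, the shell never sees them, and the same file can be reused or adapted for future export jobs.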