Terminating a Data Pump Job
You can instruct Data Pump to permanently kill an export or import job. First, attach to the job in interactive
command mode, and then issue the KILL_JOB command:
Import> kill_job
You should be prompted with the following output:
Are you sure you wish to stop this job ([yes]/no):
Type YES to permanently kill the job. Data Pump unceremoniously kills the job and drops the associated status
table owned by the user running the export or import.
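For example, the complete attach-and-kill sequence looks roughly like the following sketch. The job name shown here, SYS_IMPORT_TABLE_01, is only an assumed example of the default name Data Pump generates; your job name will differ, and you can look it up in DBA_DATAPUMP_JOBS if you're unsure:
$ impdp mv_maint/foo attach=SYS_IMPORT_TABLE_01
Import> kill_job
Are you sure you wish to stop this job ([yes]/no): yes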
Monitoring Data Pump Jobs
When you have long-running Data Pump jobs, you should occasionally check the status of the job to ensure it hasn't
failed, become suspended, and so on. There are several ways to monitor the status of Data Pump jobs:
Screen output
Data Pump log file
Querying data dictionary views
Database alert log
Querying the status table
Interactive command mode status
Using the process status (ps) OS utility
The most obvious way to monitor a job is to view the status that Data Pump displays on the screen as the job is
running. If you've disconnected from the command mode, then the status is no longer displayed on your screen.
In this situation, you must use another technique to monitor a Data Pump job.
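For instance, once the screen output is gone, one quick alternative is to query the data dictionary. The following is a minimal sketch that assumes you have privileges on the DBA_DATAPUMP_JOBS view (USER_DATAPUMP_JOBS shows only your own jobs):
SQL> select owner_name, job_name, operation, job_mode, state
  2  from dba_datapump_jobs;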
Data Pump Log File
By default, Data Pump generates a log file for every job. When you start a Data Pump job, it's good practice to name a
log file that is specific to that job:
$ impdp mv_maint/foo directory=dp_dir dumpfile=archive.dmp logfile=archive.log
This job creates a file, named archive.log , that is placed in the directory referenced by the database directory object DP_DIR .
If you don't explicitly name a log file, Data Pump import creates one named import.log , and Data Pump export
creates one named export.log .
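The DP_DIR directory object must already exist before the job runs. Here is a minimal sketch of creating it and granting access to the mv_maint user; the OS path /u01/dp is only an assumed example:
SQL> create directory dp_dir as '/u01/dp';
SQL> grant read, write on directory dp_dir to mv_maint;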
Note  The log file contains the same information you see displayed interactively on your screen when running a Data Pump job.
 
 