4 Oracle Data Pump Legacy Mode
With Oracle Data Pump legacy mode, you can use original Export and Import parameters on the Oracle Data Pump Export and Data Pump Import command lines.
- Oracle Data Pump Legacy Mode Use Cases
  Oracle Data Pump enters legacy mode when it encounters legacy export or import parameters, so that you can continue using existing scripts.
- Parameter Mappings
  You can use original Oracle Export and Import parameters when they map to Oracle Data Pump Export and Import parameters that supply similar functionality.
- Management of File Locations in Oracle Data Pump Legacy Mode
  Original Export and Import and Oracle Data Pump Export and Import differ on where dump files and log files can be written to and read from, because the original version is client-based, and Oracle Data Pump is server-based.
- Adjusting Existing Scripts for Oracle Data Pump Log Files and Errors
  When you use Oracle Data Pump in legacy mode, you must review and update your existing scripts written for original Export and Import.
4.1 Oracle Data Pump Legacy Mode Use Cases
Oracle Data Pump enters legacy mode when it encounters legacy export or import parameters, so that you can continue using existing scripts.
If you use original Export (exp) and Import (imp), then you probably have scripts you have been using for many years. Oracle Data Pump provides a legacy mode, which allows you to continue to use your existing scripts with Oracle Data Pump.
Oracle Data Pump enters legacy mode when it determines that a parameter unique to original Export or Import is present, either on the command line, or in a script. As Data Pump processes the parameter, the analogous Oracle Data Pump Export or Oracle Data Pump Import parameter is displayed. Oracle strongly recommends that you view the new syntax and make script changes as time permits.
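For example, assuming an hr schema and a dpump_dir directory object (both hypothetical), a script that passes the original Export OWNER parameter to expdp runs in legacy mode; the second command shows the analogous native Oracle Data Pump syntax:

```shell
# Original Export parameter on the expdp command line (triggers legacy mode):
expdp hr OWNER=hr FILE=hr.dmp

# Analogous native Oracle Data Pump Export syntax:
expdp hr SCHEMAS=hr DUMPFILE=hr.dmp DIRECTORY=dpump_dir
```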
Note:
The Oracle Data Pump Export and Import utilities create and read dump files and log files in Oracle Data Pump format only. They never create or read dump files compatible with original Export or Import. If you have a dump file created with original Export, then you must use original Import to import the data into the database.
4.2 Parameter Mappings
You can use original Oracle Export and Import parameters when they map to Oracle Data Pump Export and Import parameters that supply similar functionality.
- Using Original Export Parameters with Oracle Data Pump
  Oracle Data Pump Export accepts original Export parameters when they map to a corresponding Oracle Data Pump parameter.
- Using Original Import Parameters with Oracle Data Pump
  Oracle Data Pump Import accepts original Import parameters when they map to a corresponding Oracle Data Pump parameter.
4.2.1 Using Original Export Parameters with Oracle Data Pump
Oracle Data Pump Export accepts original Export parameters when they map to a corresponding Oracle Data Pump parameter.
Oracle Data Pump Interpretation of Original Export Parameters
To see how Oracle Data Pump Export interprets original Export parameters, refer to the table for comparisons. Parameters that have the same name and functionality in both original Export and Oracle Data Pump Export are not included in this table.
Table 4-1 How Oracle Data Pump Export Handles Original Export Parameters
Original Export Parameter | Action Taken by Oracle Data Pump Export
---|---
BUFFER | This parameter is ignored.
COMPRESS | This parameter is ignored. In original Export, the COMPRESS parameter affected how the initial extent was managed. The Oracle Data Pump Export COMPRESSION parameter specifies how data is compressed in the dump file, and is not related to the original Export COMPRESS parameter.
CONSISTENT | Oracle Data Pump Export determines the current time, and uses FLASHBACK_TIME.
CONSTRAINTS | If original Export used CONSTRAINTS=n, then Oracle Data Pump Export uses the EXCLUDE=CONSTRAINT parameter. If original Export used CONSTRAINTS=y, then this parameter is ignored; the default behavior is to include constraints as part of the export.
DIRECT | This parameter is ignored. Oracle Data Pump Export automatically chooses the best export method.
FEEDBACK | The Oracle Data Pump Export STATUS=30 command is used. Note that this is not a direct mapping, because the STATUS command returns the status of the export job, as well as the rows being processed. In original Export, feedback was given after a certain number of rows, as specified with the FEEDBACK parameter. In Oracle Data Pump Export, the status is given every so many seconds, as specified by STATUS.
FILE | Oracle Data Pump Export attempts to determine the path that was specified or defaulted to for the FILE parameter, and also to determine whether a directory object exists to which the schema has read and write access. See "Management of File Locations in Oracle Data Pump Legacy Mode" for more information.
GRANTS | If original Export used GRANTS=n, then Oracle Data Pump Export uses the EXCLUDE=GRANT parameter. If original Export used GRANTS=y, then this parameter is ignored, and does not need to be remapped, because that is the Oracle Data Pump Export default behavior.
INDEXES | If original Export used INDEXES=n, then Oracle Data Pump Export uses the EXCLUDE=INDEX parameter. If original Export used INDEXES=y, then this parameter is ignored, and does not need to be remapped, because that is the Oracle Data Pump Export default behavior.
LOG | Oracle Data Pump Export attempts to determine the path that was specified or defaulted to for the LOG parameter, and also to determine whether a directory object exists to which the schema has read and write access. Original Export and Import and Oracle Data Pump Export and Import differ on where dump files and log files can be written to and read from, because the original version is client-based, and Oracle Data Pump is server-based. The contents of the log file will be those of an Oracle Data Pump Export operation.
OBJECT_CONSISTENT | This parameter is ignored, because Oracle Data Pump Export processing ensures that each object is in a consistent state when being exported.
OWNER | The Oracle Data Pump SCHEMAS parameter is used.
RECORDLENGTH | This parameter is ignored, because Oracle Data Pump Export automatically takes care of buffer sizing.
RESUMABLE | This parameter is ignored, because Oracle Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role.
RESUMABLE_NAME | This parameter is ignored, because Oracle Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role.
RESUMABLE_TIMEOUT | This parameter is ignored, because Oracle Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role.
ROWS | If original Export used ROWS=y, then Oracle Data Pump Export uses the CONTENT=ALL parameter. If original Export used ROWS=n, then Oracle Data Pump Export uses the CONTENT=METADATA_ONLY parameter.
STATISTICS | This parameter is ignored, because statistics are always saved for tables as part of an Oracle Data Pump export operation.
TABLESPACES | If original Export also specified TRANSPORT_TABLESPACE=n, then Oracle Data Pump Export ignores the TABLESPACES parameter. If original Export also specified TRANSPORT_TABLESPACE=y, then Oracle Data Pump Export takes the names listed for the TABLESPACES parameter and uses them on the Oracle Data Pump TRANSPORT_TABLESPACES parameter.
TRANSPORT_TABLESPACE | If original Export used TRANSPORT_TABLESPACE=n (the default), then Oracle Data Pump Export uses the TABLESPACES parameter. If original Export used TRANSPORT_TABLESPACE=y, then Oracle Data Pump Export uses the TRANSPORT_TABLESPACES parameter, and only the metadata is exported.
TRIGGERS | If original Export used TRIGGERS=n, then Oracle Data Pump Export uses the EXCLUDE=TRIGGER parameter. If original Export used TRIGGERS=y, then this parameter is ignored, and does not need to be remapped, because that is the Oracle Data Pump Export default behavior.
TTS_FULL_CHECK | If original Export used TTS_FULL_CHECK=y, then Oracle Data Pump Export uses the TRANSPORT_FULL_CHECK parameter. If original Export used TTS_FULL_CHECK=n, then this parameter is ignored, and does not need to be remapped.
VOLSIZE | When the original Export VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Oracle Data Pump Export dump file format does not support tape devices. Therefore, this operation terminates with an error.
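To illustrate several of the mappings in Table 4-1 together (schema, file, and directory object names are hypothetical), compare a legacy-style command with its native Oracle Data Pump equivalent:

```shell
# Legacy mode: original Export parameters accepted by expdp:
expdp hr OWNER=hr GRANTS=n INDEXES=n FILE=hr.dmp LOG=hr.log

# Native Oracle Data Pump Export equivalent, per the mappings above:
expdp hr SCHEMAS=hr EXCLUDE=GRANT,INDEX DUMPFILE=hr.dmp LOGFILE=hr.log DIRECTORY=dpump_dir
```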
4.2.2 Using Original Import Parameters with Oracle Data Pump
Oracle Data Pump Import accepts original Import parameters when they map to a corresponding Oracle Data Pump parameter.
To see how Oracle Data Pump Import interprets original Import parameters, refer to the table for comparisons. Parameters that have the same name and functionality in both original Import and Oracle Data Pump Import are not included in this table.
Table 4-2 How Oracle Data Pump Import Handles Original Import Parameters
Original Import Parameter | Action Taken by Oracle Data Pump Import
---|---
BUFFER | This parameter is ignored.
CHARSET | This parameter was desupported several releases ago, and should no longer be used. Attempting to use this desupported parameter causes the Oracle Data Pump Import operation to abort.
COMMIT | This parameter is ignored. Oracle Data Pump Import automatically performs a commit after each table is processed.
COMPILE | This parameter is ignored. Oracle Data Pump Import compiles procedures after they are created. If necessary for dependencies, a recompile can be run.
CONSTRAINTS | If original Import used CONSTRAINTS=n, then Oracle Data Pump Import uses the EXCLUDE=CONSTRAINT parameter. If original Import used CONSTRAINTS=y, then this parameter is ignored, and does not need to be remapped, because that is the Oracle Data Pump Import default behavior.
DATAFILES | The Oracle Data Pump Import TRANSPORT_DATAFILES parameter is used.
DESTROY | If original Import used DESTROY=y, then Oracle Data Pump Import uses the REUSE_DATAFILES=y parameter. If original Import used DESTROY=n, then this parameter is ignored, and does not need to be remapped, because that is the Oracle Data Pump Import default behavior.
FEEDBACK | The Oracle Data Pump Import STATUS=30 command is used. Note that this is not a direct mapping, because the STATUS command returns the status of the import job, as well as the rows being processed. In original Import, feedback was given after a certain number of rows, as specified with the FEEDBACK parameter. In Oracle Data Pump Import, the status is given every so many seconds, as specified by STATUS.
FILE | Oracle Data Pump Import attempts to determine the path that was specified or defaulted to for the FILE parameter, and also to determine whether a directory object exists to which the schema has read and write access. Original Export and Import and Oracle Data Pump Export and Import differ on where dump files and log files can be written to and read from, because the original version is client-based, and Oracle Data Pump is server-based.
FILESIZE | This parameter is ignored, because the information is already contained in the Oracle Data Pump dump file set.
FROMUSER | The Oracle Data Pump Import SCHEMAS parameter is used.
GRANTS | If original Import used GRANTS=n, then Oracle Data Pump Import uses the EXCLUDE=OBJECT_GRANT parameter. If original Import used GRANTS=y, then this parameter is ignored, and does not need to be remapped, because that is the Oracle Data Pump Import default behavior.
IGNORE | If original Import used IGNORE=y, then Oracle Data Pump Import uses the TABLE_EXISTS_ACTION=APPEND parameter. If original Import used IGNORE=n, then this parameter is ignored, and does not need to be remapped, because that is the Oracle Data Pump Import default behavior.
INDEXES | If original Import used INDEXES=n, then Oracle Data Pump Import uses the EXCLUDE=INDEX parameter. If original Import used INDEXES=y, then this parameter is ignored, and does not need to be remapped, because that is the Oracle Data Pump Import default behavior.
INDEXFILE | The Oracle Data Pump Import SQLFILE parameter is used. The same method and attempts made when looking for a directory object described for the FILE parameter also take place for the INDEXFILE parameter. If no directory object was specified on the original Import, then Oracle Data Pump Import uses the directory object specified with the DIRECTORY parameter.
LOG | Oracle Data Pump Import attempts to determine the path that was specified or defaulted to for the LOG parameter, and also to determine whether a directory object exists to which the schema has read and write access. The contents of the log file will be those of an Oracle Data Pump Import operation.
RECORDLENGTH | This parameter is ignored, because Oracle Data Pump handles issues about record length internally.
RESUMABLE | This parameter is ignored, because this functionality is automatically provided for users who have been granted the IMP_FULL_DATABASE role.
RESUMABLE_NAME | This parameter is ignored, because this functionality is automatically provided for users who have been granted the IMP_FULL_DATABASE role.
RESUMABLE_TIMEOUT | This parameter is ignored, because this functionality is automatically provided for users who have been granted the IMP_FULL_DATABASE role.
ROWS | If original Import used ROWS=n, then Oracle Data Pump Import uses the CONTENT=METADATA_ONLY parameter. If original Import used ROWS=y, then Oracle Data Pump Import uses the CONTENT=ALL parameter.
SHOW | If SHOW=y is specified, then the Oracle Data Pump Import SQLFILE parameter is used to write the DDL for the import operation to a file. Only the DDL (not the entire contents of the dump file) is written to the specified file. The file name given is the file name specified on the DUMPFILE parameter, with .sql appended.
STATISTICS | This parameter is ignored, because statistics are always saved for tables as part of an Oracle Data Pump Import operation.
STREAMS_CONFIGURATION | This parameter is ignored, because Oracle Data Pump Import automatically determines it; it does not need to be specified.
STREAMS_INSTANTIATION | This parameter is ignored, because Oracle Data Pump Import automatically determines it; it does not need to be specified.
TABLESPACES | If original Import also specified TRANSPORT_TABLESPACE=n (the default), then Oracle Data Pump Import ignores the TABLESPACES parameter. If original Import also specified TRANSPORT_TABLESPACE=y, then Oracle Data Pump Import takes the names supplied for this TABLESPACES parameter and applies them to the Oracle Data Pump TRANSPORT_TABLESPACES parameter.
TOID_NOVALIDATE | This parameter is ignored. OIDs are no longer used for type validation.
TOUSER | The Oracle Data Pump Import REMAP_SCHEMA parameter is used. There may be more objects imported than with original Import. The FROMUSER parameter must also have been specified.
TRANSPORT_TABLESPACE | The TRANSPORT_TABLESPACE parameter is ignored, but if you also specified the DATAFILES parameter, then the import job continues to load the metadata. If the DATAFILES parameter is not specified, then an ORA-39002: invalid operation error message is returned.
TTS_OWNERS | This parameter is ignored, because this information is automatically stored in the Oracle Data Pump dump file set.
VOLSIZE | When the original Import VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Oracle Data Pump Import dump file format does not support tape devices. Therefore, this operation terminates with an error.
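Similarly for import (all names hypothetical), the legacy FROMUSER, TOUSER, and IGNORE parameters map to REMAP_SCHEMA and TABLE_EXISTS_ACTION:

```shell
# Legacy mode: original Import parameters accepted by impdp:
impdp system FROMUSER=hr TOUSER=hr2 FILE=hr.dmp IGNORE=y

# Native Oracle Data Pump Import equivalent, per the mappings above:
impdp system REMAP_SCHEMA=hr:hr2 DUMPFILE=hr.dmp DIRECTORY=dpump_dir TABLE_EXISTS_ACTION=APPEND
```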
4.3 Management of File Locations in Oracle Data Pump Legacy Mode
Original Export and Import and Oracle Data Pump Export and Import differ on where dump files and log files can be written to and read from, because the original version is client-based, and Data Pump is server-based.
Original Export and Import use the FILE and LOG parameters to specify dump file and log file names, respectively. These file names always refer to files local to the client system, and they may also contain a path specification.

Oracle Data Pump Export and Import use the DUMPFILE and LOGFILE parameters to specify dump file and log file names, respectively. These file names always refer to files local to the server system, and cannot contain any path information. Instead, a directory object is used to indirectly specify path information. The path value defined by the directory object must be accessible to the server. The directory object is specified for an Oracle Data Pump job through the DIRECTORY parameter. It is also possible to prepend a directory object to the file names passed to the DUMPFILE and LOGFILE parameters. For privileged users, Oracle Data Pump supports the use of a default directory object if one is not specified on the command line. This default directory object, DATA_PUMP_DIR, is set up at installation time.
If Data Pump legacy mode is enabled, and the original Export FILE=filespec parameter, the LOG=filespec parameter, or both are present on the command line, then the following rules of precedence are used to determine file location:
Note:
If the FILE parameter and LOG parameter are both present on the command line, then the rules of precedence are applied separately to each parameter.
When a mix of original Export/Import and Oracle Data Pump Export/Import parameters is used, separate rules apply to them. For example, suppose you have the following command:

expdp system FILE=/user/disk/foo.dmp LOGFILE=foo.log DIRECTORY=dpump_dir

In this case, the Oracle Data Pump legacy mode file management rules, as explained in this section, apply to the FILE parameter. The normal (that is, non-legacy mode) Oracle Data Pump file management rules for default locations of dump, log, and SQL files apply to the LOGFILE parameter.
Example 4-1 Oracle Data Pump Legacy Mode File Management Rules Applied
File management proceeds in the following sequence:
1. If you specify a path location as part of the file specification, then Oracle Data Pump attempts to look for a directory object accessible to the schema running the export job whose path location matches the path location of the file specification. If such a directory object cannot be found, then an error is returned. For example, suppose that you defined a server-based directory object named USER_DUMP_FILES with a path value of '/disk1/user1/dumpfiles/', and that read and write access to this directory object has been granted to the hr schema. The following command causes Oracle Data Pump to look for a server-based directory object whose path value contains '/disk1/user1/dumpfiles/' and to which the hr schema has been granted read and write access:

   expdp hr FILE=/disk1/user1/dumpfiles/hrdata.dmp

   In this case, Oracle Data Pump uses the directory object USER_DUMP_FILES. The path value, in this example '/disk1/user1/dumpfiles/', must refer to a path on the server system that is accessible to Oracle Database.

   If a path location is specified as part of the file specification, then any directory object provided using the DIRECTORY parameter is ignored. For example, if you issue the following command, then Oracle Data Pump does not use the DPUMP_DIR directory object for the file parameter, but instead looks for a server-based directory object whose path value contains '/disk1/user1/dumpfiles/' and to which the hr schema has been granted read and write access:

   expdp hr FILE=/disk1/user1/dumpfiles/hrdata.dmp DIRECTORY=dpump_dir

2. If you have not specified a path location as part of the file specification, then the directory object named by the DIRECTORY parameter is used. For example, if you issue the following command, then Oracle Data Pump applies the path location defined for the DPUMP_DIR directory object to the hrdata.dmp file:

   expdp hr FILE=hrdata.dmp DIRECTORY=dpump_dir

3. If you specify no path location as part of the file specification, and no directory object is named by the DIRECTORY parameter, then Oracle Data Pump does the following, in the order shown:

   a. Oracle Data Pump looks for the existence of a directory object of the form DATA_PUMP_DIR_schema_name, where schema_name is the schema that is running the Oracle Data Pump job. For example, the following command would cause Oracle Data Pump to look for the existence of a server-based directory object named DATA_PUMP_DIR_HR:

      expdp hr FILE=hrdata.dmp

      The hr schema also must have been granted read and write access to this directory object. If such a directory object does not exist, then the process moves to step b.

   b. Oracle Data Pump looks for the existence of the client-based environment variable DATA_PUMP_DIR. For instance, suppose that a server-based directory object named DUMP_FILES1 has been defined, and the hr schema has been granted read and write access to it. Then, on the client system, you can set the environment variable DATA_PUMP_DIR to point to DUMP_FILES1 as follows:

      setenv DATA_PUMP_DIR DUMP_FILES1
      expdp hr FILE=hrdata.dmp

      Oracle Data Pump then uses the server-based directory object DUMP_FILES1 for the hrdata.dmp file. If a client-based environment variable DATA_PUMP_DIR does not exist, then the process moves to step c.

   c. If the schema that is running the Oracle Data Pump job has DBA privileges, then the default Oracle Data Pump directory object, DATA_PUMP_DIR, is used. This default directory object is established at installation time. For example, the following command causes Oracle Data Pump to attempt to use the default DATA_PUMP_DIR directory object, assuming that system has DBA privileges:

      expdp system FILE=hrdata.dmp
4.4 Adjusting Existing Scripts for Oracle Data Pump Log Files and Errors
When you use Oracle Data Pump in legacy mode, you must review and update your existing scripts written for original Export and Import.
Oracle Data Pump legacy mode requires that you make adjustments to existing scripts, because of differences in file format and error reporting.
- Log Files
  Oracle Data Pump Export and Import do not generate log files in the same format as those created by original Export and Import.
- Error Cases
  The errors that Oracle Data Pump Export and Import generate can be different from the errors generated by original Export and Import.
- Exit Status
  Oracle Data Pump Export and Import have enhanced exit status values to enable scripts to better determine the success or failure of export and import jobs.
4.4.1 Log Files
Oracle Data Pump Export and Import do not generate log files in the same format as those created by original Export and Import.
You must update any scripts you have that parse the output of original Export and Import, so that they handle the log file format used by Oracle Data Pump Export and Import. For example, the message Successfully Terminated does not appear in Oracle Data Pump log files.
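As a sketch of what such an update can look like (the log path and the exact completion wording are assumptions; Data Pump completion messages vary by release and job name), a script can stop grepping for the original Export message and instead look for the Data Pump job-completion text:

```shell
# Hypothetical helper: succeed if a Data Pump log reports job completion.
# The phrase "successfully completed" is an assumption about the log wording;
# verify it against logs produced by your own release.
log_ok() {
  grep -qi "successfully completed" "$1"
}

# Simulated Data Pump log tail, for illustration only:
printf 'Job "HR"."SYS_EXPORT_SCHEMA_01" successfully completed at 10:00:00\n' \
  > /tmp/expdp_demo.log

if log_ok /tmp/expdp_demo.log; then
  echo "export succeeded"   # prints: export succeeded
fi
```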
4.4.2 Error Cases
The errors that Oracle Data Pump Export and Import generate can be different from the errors generated by original Export and Import.
For example, suppose that a parameter that is ignored by Oracle Data Pump Export would have generated an out-of-range value in original Export. In that case, an informational message is written to the log file stating that the parameter is being ignored. However, no value checking is performed, so no error message is generated.
4.4.3 Exit Status
Oracle Data Pump Export and Import have enhanced exit status values to enable scripts to better determine the success or failure of export and import jobs.
Because Oracle Data Pump Export and Import can have different exit status values, Oracle recommends that you review, and if necessary, update, any scripts that look at the exit status.
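A minimal sketch of such a check, assuming the commonly documented Data Pump exit codes (0 for success, 5 for completion with recoverable errors, nonzero otherwise; confirm the values against your release):

```shell
# Interpret a Data Pump exit status captured from $? after expdp or impdp.
# The meanings of codes 0 and 5 are assumptions to verify for your release.
interpret_dp_status() {
  case "$1" in
    0) echo "job completed successfully" ;;
    5) echo "job completed with errors" ;;
    *) echo "job failed" ;;
  esac
}

# Typical usage after a job:
#   expdp hr SCHEMAS=hr DUMPFILE=hr.dmp DIRECTORY=dpump_dir
#   interpret_dp_status $?
interpret_dp_status 0   # prints: job completed successfully
```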