Why do I get an error when I use the DBMS_DATAPUMP API in my Amazon RDS for Oracle DB instance?


I receive an error when I try to use DBMS_DATAPUMP in my Amazon Relational Database Service (Amazon RDS) for Oracle DB instance.

Short description

DBMS_DATAPUMP fails for the following reasons:

  • Missing user permissions or role
  • Missing read and write permissions on the Oracle Data Pump Directory
  • File permissions issues
  • Source and target version incompatibility or time zone file version mismatch
  • Minor syntax errors in the DBMS_DATAPUMP PL/SQL block

The following are common errors that you might see when you use DBMS_DATAPUMP:

  • ORA-39001: invalid argument value
  • ORA-31626: job does not exist
  • ORA-39002: invalid operation
  • ORA-39070: Unable to open the log file

Resolution

If you receive a DBMS_DATAPUMP API error when you perform an import or export on an RDS for Oracle DB instance, then complete the following troubleshooting steps.

Review the Data Pump import log file contents for errors

Run the following SQL commands to review the import log file and the DB alert log for any errors. Replace DATA_PUMP_DIR and <import log filename> with your values:

-- View import logs from the DATA_PUMP_DIR directory.
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR','<import log filename>'));

You can also use the following SQL statement to access the DB alert log:

SELECT message_text FROM alertlog;

Check your user permissions

Before you start an export or import job, make sure that your DB user has sufficient permissions. For more information, see Required roles for Oracle Data Pump export and import operations on the Oracle website.

The following are the minimum requirements to run an export or import operation:

  • CREATE SESSION
  • CREATE TABLE
  • READ and WRITE object permissions on a valid directory object
  • Sufficient tablespace quota on the user's default tablespace
  • DATAPUMP_EXP_FULL_DATABASE database role to run a full database Data Pump export job
  • DATAPUMP_IMP_FULL_DATABASE database role to run a full database Data Pump import job

Note: These requirements apply to the user that connects to the database when you run a Data Pump export or import job. They don't apply to the user that's exported or imported.
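
The following is a minimal sketch of how you might grant these permissions. The username DPUMP_USER and the USERS tablespace are placeholders; adjust them for your environment, and grant the full-database roles only if you need them. For the directory permissions, see the next section:

-- DPUMP_USER and the USERS tablespace are placeholders.
GRANT CREATE SESSION, CREATE TABLE TO DPUMP_USER;
ALTER USER DPUMP_USER QUOTA UNLIMITED ON USERS;
-- Only for full database export or import jobs:
GRANT DATAPUMP_EXP_FULL_DATABASE, DATAPUMP_IMP_FULL_DATABASE TO DPUMP_USER;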

To check for granted roles and the default role, run the following SQL. Replace <USERNAME> with the username that connects to the database when you run the Data Pump job:

SELECT grantee, granted_role, default_role
  FROM dba_role_privs
 WHERE grantee IN ('<USERNAME>', 'PUBLIC')
 ORDER BY 1, 2;

To check the granted system permissions, run the following command. Replace <USERNAME> with the username that connects to the database when you run the Data Pump job:

SELECT grantee, privilege
  FROM dba_sys_privs
 WHERE (grantee IN ('<USERNAME>', 'PUBLIC')
     OR grantee IN (SELECT granted_role FROM dba_role_privs
                     WHERE grantee IN ('<USERNAME>', 'PUBLIC')))
 ORDER BY 1;

Grant read and write permissions on the directory

To confirm that you granted read and write permissions on the directory, run the following command. Replace <USERNAME> with the username that connects to the database when you run the Data Pump job:

SELECT privilege, type FROM dba_tab_privs WHERE table_name = 'DATA_PUMP_DIR' AND grantee = '<USERNAME>';
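
If the query returns no rows, then the grant is missing. The following statement shows the standard Oracle grant; replace <USERNAME> with the username that runs the Data Pump job:

GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO <USERNAME>;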

Check the database directory that you use to store the dump and log file

Check the existence of the database directory that you use to store the dump file and the log file. If you use the DATA_PUMP_DIR directory, then run a command similar to the following one:

SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';
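
If the directory doesn't exist, then you can create a new one with the rdsadmin package. In the following sketch, the directory name MY_DATA_PUMP_DIR is a placeholder:

EXEC rdsadmin.rdsadmin_util.create_directory(p_directory_name => 'MY_DATA_PUMP_DIR');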

Check the Data Pump Export dump file

To check for the existence of the Data Pump export dump file, run a command similar to the following one:

SELECT * FROM table(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR')) WHERE filename = 'dumpname.dmp';
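
If a dump file with the same name already exists and you want to rerun the export, then one option is to remove the old file first. The following sketch uses UTL_FILE, and dumpname.dmp is a placeholder:

-- Remove a stale dump file from the DATA_PUMP_DIR directory.
EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'dumpname.dmp');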

Check the Compatibility Matrix

Review Compatibility of Data Pump between different Oracle versions on the Oracle website. You can import the dump file only to a database with the same or higher compatibility level. You might need to import the dump file to a lower version target database. In this case, use the export Data Pump VERSION parameter to match the compatibility level of the target database. For more information, see VERSION on the Oracle website.

This example command uses the VERSION parameter to match the compatibility level of the target database:

DECLARE  
hdnl NUMBER;  
BEGIN  
hdnl := DBMS_DATAPUMP.OPEN( operation => 'EXPORT', job_mode => 'SCHEMA', job_name=>null,version => '19.0.0.0');  
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'test_src6.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);  
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'test_src6.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);  
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''USER1'',''USER2'')');  
DBMS_DATAPUMP.START_JOB(hdnl);  
END;  
/
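
To choose a VERSION value, you can check the compatibility level of the target database first. For example:

SELECT name, value FROM v$parameter WHERE name = 'compatible';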

If the time zone file version is higher in the source than in the target, then you might see the following errors when you perform an import:

  • ORA-39002: invalid operation
  • ORA-39405: Oracle Data Pump does not support importing from a source database with TSTZ version XX into a target database with TSTZ version XXX

To resolve this issue, you must patch or update the target database with the source time zone file version.

To find the Oracle time zone version, run the following query:

SELECT name, value$ FROM sys.props$ WHERE name = 'DST_PRIMARY_TT_VERSION';
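
You can also check the version of the time zone file that's installed in the database:

SELECT * FROM v$timezone_file;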

Check the download task log file

If you use Amazon Simple Storage Service (Amazon S3) integration to download the dump file, then check the download task log file to make sure that the dump file copied without errors. Run the following commands to check the download task log file. Replace <task-id> with the task ID that the upload or download procedures returned:

SELECT * FROM table(rdsadmin.rds_file_util.listdir(p_directory => 'BDUMP')) WHERE filename LIKE 'dbtask%' ORDER BY mtime;
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-<task-id>.log'));
-- Example with a specific task ID:
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-1611697375220-44.log'));
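
For reference, the task ID comes from the rdsadmin_s3_tasks procedure that starts the transfer. The following sketch starts a download, and the bucket name amzn-s3-demo-bucket is a placeholder:

-- Start an S3 download task; the query returns the task ID.
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'amzn-s3-demo-bucket',
         p_directory_name => 'DATA_PUMP_DIR')
       AS task_id
  FROM DUAL;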

Resolve syntax errors

Generally, minor syntax errors cause Data Pump API failures. The following example shows an error that results from a syntax issue:

DECLARE  
hdnl NUMBER;  
BEGIN  
hdnl := DBMS_DATAPUMP.OPEN( operation => 'EXPORT', job_mode => 'SCHEMA', job_name=>null);  
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'TestDumpfile.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);  
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'test_logfile.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);  
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''user1'',''user2'')');  
DBMS_DATAPUMP.START_JOB(hdnl);  
END;  
/  
Error report -  
ORA-39001: invalid argument value  
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79  
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4929  
ORA-06512: at "SYS.DBMS_DATAPUMP", line 6378  
ORA-06512: at line 7  
39001. 00000 -  "invalid argument value"  
*Cause:    The user specified API parameters were of the wrong type or  
           value range.  Subsequent messages supplied by  
           DBMS_DATAPUMP.GET_STATUS will further describe the error.

To resolve these errors, take the following actions:

  • Confirm that the user, tables, and objects that are specified in the command exist in the database.
  • Check that the existing dump file's name is the same as the name that you specify in the command.
  • Before you call DBMS_DATAPUMP.ADD_FILE, make sure that the dump file doesn't exist at the target location. If the dump file exists, then remove the file. Or, make the filename unique for every call.
  • Make sure that the user or schema name is in uppercase.
  • If the API call references a database link, then make sure that you can access the database link. Also, before you use the database link in the DBMS_DATAPUMP API call, confirm that you can run a query over the link, as shown in the example after this list.
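
For example, the following query tests a database link before you reference it in the API call. The link name MY_DB_LINK is a placeholder:

SELECT 1 FROM dual@MY_DB_LINK;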

Use exception handling to capture the detailed error message

In some scenarios, DBMS_DATAPUMP gives a generic error that doesn't provide much detail. Use an exception handling block to get additional information about the error. For examples of exception handling blocks, see Examples of using the Data Pump API on the Oracle website.
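
The following is a minimal sketch of such a block, modeled on the examples in the Oracle documentation. The dump file name and schema name are placeholders:

DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'EXPORT', job_mode => 'SCHEMA', job_name=>null);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'example_exp.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''USER1'')');
DBMS_DATAPUMP.START_JOB(hdnl);
EXCEPTION
  WHEN OTHERS THEN
    DECLARE
      ind       NUMBER;         -- index into the error log entries
      job_state VARCHAR2(30);
      sts       ku$_Status;     -- detailed job status
      le        ku$_LogEntry;   -- collection of error log lines
    BEGIN
      -- Retrieve the detailed error stack behind the generic error.
      DBMS_DATAPUMP.GET_STATUS(hdnl, dbms_datapump.ku$_status_job_error, 0, job_state, sts);
      le := sts.error;
      IF le IS NOT NULL THEN
        ind := le.FIRST;
        WHILE ind IS NOT NULL LOOP
          DBMS_OUTPUT.PUT_LINE(le(ind).LogText);
          ind := le.NEXT(ind);
        END LOOP;
      END IF;
      RAISE;
    END;
END;
/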

Based on the information in the detailed error message, you can then resolve the issue. For example, if the dump file name already exists in DATA_PUMP_DIR, then the exception block gives an error similar to the following one:

"Exception in Data Pump job
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "/rdsdbdata/datapump/example3.dmp"
ORA-27038: created file already exists
Additional information: 1"

If you specify the schema as user1, then the exception block gives an error similar to the following one:

"Exception in Data Pump job
ORA-39001: invalid argument value
ORA-39170: Schema expression IN ('user1') does not correspond to any schemas."

Monitor the progress of Data Pump jobs

The DBA_DATAPUMP_JOBS view shows whether Data Pump export or import jobs are active or have terminated with a success or failure status. To get detailed information about a Data Pump import or export, query the V$SESSION_LONGOPS view. For more information, see DBA_DATAPUMP_JOBS and V$SESSION_LONGOPS on the Oracle website.

For example, you can run the following command to check the current status of a Data Pump job and the percentage of work that's completed:

SELECT sl.sid, sl.serial#, sl.sofar, sl.totalwork, dp.owner_name, dp.state, dp.job_mode  
FROM v$session_longops sl, v$datapump_job dp  
WHERE sl.opname = dp.job_name  
AND sl.sofar != sl.totalwork;
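
You can also list the jobs and their states directly:

SELECT owner_name, job_name, operation, job_mode, state FROM dba_datapump_jobs;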

Related information

Importing using Oracle Data Pump

DBMS_DATAPUMP on the Oracle website

What are the basic steps in using the Data Pump API? on the Oracle website
