
AWS RDS Oracle Database Import using two export dump files


I am trying to import data using Oracle Data Pump, and I have two export dump files.
The import works fine when there is one export dump file, but with two I get an error. Is there a way to import into an AWS RDS Oracle database using two export dump files?

Here is the import I run and the error:

DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'expdp_forecaster_prod.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'expdp_gmsg2k_prod.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'gmsg2k-fcstr-scws_imp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''FCSTR'',''SCWS'',''GMSG2K'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/

ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4929
ORA-06512: at "SYS.DBMS_DATAPUMP", line 5180
ORA-06512: at line 6

3 Answers

Yes, it is possible to import data into an AWS RDS Oracle database using multiple export dump files. The error you're encountering is likely due to the way you're specifying the dump files in your import script. Here's how you can modify your script to handle multiple dump files:

  1. Instead of using DBMS_DATAPUMP.ADD_FILE for each dump file, you can use a wildcard pattern to include all relevant dump files.

  2. Ensure that all your dump files are in the DATA_PUMP_DIR directory on your RDS instance.

  3. Modify your PL/SQL script as follows:

DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'expdp_%_prod.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'gmsg2k-fcstr-scws_imp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''FCSTR'',''SCWS'', ''GMSG2K'')');
DBMS_DATAPUMP.START_JOB(hdnl);
END;
/

In this modified script, we're using 'expdp_%_prod.dmp' as the filename pattern. This will include both 'expdp_forecaster_prod.dmp' and 'expdp_gmsg2k_prod.dmp' files in your import job.
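One caveat worth noting: the substitution variable Oracle documents for multi-file dump sets is '%U', and it resolves only to files that were written by a single export job as one dump file set; dump files produced by two separate export jobs generally cannot be combined in one import job. A hypothetical sketch, assuming the export had been run as a single job with DUMPFILE=expdp_prod_%U.dmp (a filename pattern not used in the question):

```sql
-- Hypothetical: had ONE export job written the set as expdp_prod_01.dmp,
-- expdp_prod_02.dmp, ..., the import could reference the whole set with %U.
DBMS_DATAPUMP.ADD_FILE(
  handle    => hdnl,
  filename  => 'expdp_prod_%U.dmp',   -- %U expands to 01, 02, ...
  directory => 'DATA_PUMP_DIR',
  filetype  => dbms_datapump.ku$_file_type_dump_file);
```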

If you still encounter issues, make sure that:

  1. Both dump files are present in the DATA_PUMP_DIR directory.
  2. The user executing the import has the necessary permissions.
  3. There's sufficient storage space in your RDS instance for the import.
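As a sketch of the first check, you can list the contents of DATA_PUMP_DIR with the RDS-provided RDSADMIN.RDS_FILE_UTIL.LISTDIR function (the dump file names below are the ones from the question):

```sql
-- List the files currently visible in DATA_PUMP_DIR on the RDS instance;
-- both expdp_forecaster_prod.dmp and expdp_gmsg2k_prod.dmp should appear.
SELECT filename, type, filesize, mtime
  FROM TABLE(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR'))
 ORDER BY filename;
```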

Remember to monitor the progress of your import job using views like DBA_DATAPUMP_JOBS or V$SESSION_LONGOPS. If you continue to face issues, review the import log file for more detailed error messages.
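For example, queries along these lines (DBA_DATAPUMP_JOBS is a standard dictionary view; the log filename is the one used in the script above) let you watch the job and read its log without shell access to the RDS host:

```sql
-- Check the state of any running or recently completed Data Pump jobs.
SELECT owner_name, job_name, operation, job_mode, state
  FROM DBA_DATAPUMP_JOBS;

-- Read the import log directly from DATA_PUMP_DIR using the
-- RDS-provided rdsadmin.rds_file_util.read_text_file helper.
SELECT text
  FROM TABLE(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR', 'gmsg2k-fcstr-scws_imp.log'));
```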
Sources
Data migration using Oracle Data Pump - Strategies for Migrating Oracle Databases to AWS
Importing using Oracle Data Pump - Amazon Relational Database Service
Resolve DBMS DATAPUMP API errors for RDS for Oracle DB instances | AWS re:Post

answered a year ago

I appreciate your response

I modified the PL/SQL script and tried the import; it failed. I have the following export dump files:

SQL> SELECT * FROM TABLE(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) ORDER BY filename;

FILENAME                    TYPE       FILESIZE   MTIME
datapump/                   directory  4096       20-OCT-24
expdp_bhardwar_prod.dmp     file       344064     18-OCT-24
expdp_forecaster_prod.dmp   file       277532672  18-OCT-24

expdp_forecaster_prod.dmp is an export of two schemas, 'SCWS' and 'FCSTR'; expdp_bhardwar_prod.dmp is an export of schema 'BHARDWAR'.

I tried an import using these two files:

SQL> DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'expdp_%_prod.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'bhardwar-fcstr-scws_imp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''FCSTR'',''SCWS'',''BHARDWAR'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/

DECLARE
*
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4929
ORA-06512: at "SYS.DBMS_DATAPUMP", line 5180
ORA-06512: at line 5

I get an error. When I import using one dump file at a time, it works:

DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'expdp_forecaster_prod.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'bhardwar-fcstr-scws_imp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''FCSTR'',''SCWS'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/

PL/SQL procedure successfully completed.

SQL> DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'expdp_bhardwar_prod.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'bhardwar-fcstr-scws_imp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''BHARDWAR'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/

PL/SQL procedure successfully completed.

All three schemas get imported using two separate imports. Is there a way to import using multiple dump files in a single import job?

Thanks

answered a year ago

I made the changes and I still get an error.

SQL> DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'expdp_%_prod.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'gmsg2k-fcstr-scws_imp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''FCSTR'',''SCWS'',''GMSG2K'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/

DECLARE
*
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4929
ORA-06512: at "SYS.DBMS_DATAPUMP", line 5180
ORA-06512: at line 5

I checked that the following conditions are met:

  1. Both dump files are present in the DATA_PUMP_DIR directory.
  2. The user executing the import has the necessary permissions.
  3. There's sufficient storage space in my RDS instance for the import.

answered a year ago
