I want to troubleshoot issues when I integrate Amazon Relational Database Service (Amazon RDS) for Oracle with Amazon Simple Storage Service (Amazon S3).
Short description
Before you transfer files between an Amazon RDS for Oracle database instance and an Amazon S3 bucket, confirm the following requirements:
- The database instance and the S3 bucket are in the same AWS Region.
- The AWS Identity and Access Management (IAM) role for the database instance has the required access to the S3 bucket.
- The database instance is associated with an option group that includes the S3_INTEGRATION option.
- You upload files only from an Oracle directory object.
Note: This directory might be a user-created directory or a DATA_PUMP_DIR directory.
- If you use a user-created directory, then make sure that the user that does the upload and download has the required permissions on the directory.
- If you copy data within subdirectories, then specify the name of the subdirectory explicitly when you use the rdsadmin packages.
Note: The S3_INTEGRATION option can't copy subdirectories recursively. You can use the S3_INTEGRATION option only to copy files between an Oracle logical directory, such as DATA_PUMP_DIR, and Amazon S3.
Resolution
Note: If you receive errors when you run AWS Command Line Interface (AWS CLI) commands, then see Troubleshoot AWS CLI errors. Also, make sure that you're using the most recent AWS CLI version.
Verify that the IAM role for the database instance has permissions to the S3 bucket
Complete the following steps:
- Open the Amazon Aurora and RDS console.
- In the navigation pane, choose Databases.
- Select the database instance that you want to check.
- Choose the Connectivity & security tab.
- Scroll down to the Manage IAM roles section and check whether you have an Active IAM role that includes the following minimum permissions:
s3:ListBucket
s3:PutObject
s3:GetObject
You can also run the describe-db-instances command:
aws rds describe-db-instances --db-instance-identifier example-db-instance --query 'DBInstances[*].[AssociatedRoles[*]]'
Note: Replace example-db-instance with the identifier of your database instance.
- If the IAM role exists but doesn't have the minimum permissions, then update the role to include them.
If you don't have an IAM role with the required permissions, then complete the following steps:
- Create an IAM policy with the permissions required to transfer the files.
Note: When you create the IAM policy, be sure to apply bucket-level permissions to the bucket resource and object-level permissions to object resources.
- Create an IAM role for your Amazon RDS for Oracle database instance and attach your policy to the role.
- Associate your IAM role with your Amazon RDS for Oracle database instance.
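As a sketch of the three steps above, the following AWS CLI commands create a role that Amazon RDS can assume, attach an inline policy with the minimum S3 permissions, and associate the role with the instance. The role name example-rds-s3-role, the policy name example-s3-access, and the account ID are placeholder values.

```shell
# Create a role that RDS can assume (the trust policy allows rds.amazonaws.com).
aws iam create-role \
  --role-name example-rds-s3-role \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"Service": "rds.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }]
  }'

# Attach an inline policy with the minimum S3 permissions.
aws iam put-role-policy \
  --role-name example-rds-s3-role \
  --policy-name example-s3-access \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject"],
      "Resource": ["arn:aws:s3:::example-bucket", "arn:aws:s3:::example-bucket/*"]
    }]
  }'

# Associate the role with the database instance for the S3_INTEGRATION feature.
aws rds add-role-to-db-instance \
  --db-instance-identifier example-db-instance \
  --role-arn arn:aws:iam::123456789012:role/example-rds-s3-role \
  --feature-name S3_INTEGRATION
```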
If the issue with bucket permissions persists, then use the following example policy to grant the IAM role full access to the S3 bucket:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:*"
      ],
      "Resource": [
        "arn:aws:s3:::example-bucket/*",
        "arn:aws:s3:::example-bucket"
      ]
    }
  ]
}
Important: Grant full access to the S3 bucket only to test whether improperly configured IAM permissions caused the issue. After you resolve the issue, return the custom permissions to their previous state.
To upload files to a specific folder in Amazon S3, use the p_s3_prefix parameter in the rdsadmin.rdsadmin_s3_tasks.upload_to_s3 procedure. Be sure that you have the required permissions before you upload the files.
For example, to upload files to the example-folder folder in the S3 bucket example-bucket, attach the following IAM policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::example-bucket/example-folder/*"
      ]
    }
  ]
}
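With a policy like the preceding one in place, a call such as the following sketch uploads the contents of the DATA_PUMP_DIR directory object to the example-folder prefix. The bucket, folder, and directory names are placeholder values from this article.

```sql
-- Upload files from the DATA_PUMP_DIR directory object to
-- s3://example-bucket/example-folder/. The call returns a task ID
-- that you can use to find the transfer log.
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
         p_bucket_name    => 'example-bucket',
         p_s3_prefix      => 'example-folder/',
         p_prefix         => '',
         p_directory_name => 'DATA_PUMP_DIR')
       AS task_id
FROM dual;
```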
Verify that you have an option group attached to your RDS database instance
Complete the following steps:
- Open the Amazon Aurora and RDS console.
- In the navigation pane, choose Databases.
- Choose the database instance that you want to check.
- Choose the Configuration tab and take note of the Option groups parameter.
- To check whether this option group is in the in-sync state, run the following describe-db-instances command:
aws rds describe-db-instances --db-instance-identifier example-db-instance --query 'DBInstances[*].[OptionGroupMemberships[*]]'
Note: Replace example-db-instance with the identifier of your database instance.
- Do one of the following:
If the option group isn't in the in-sync state, then choose the Logs & events tab and review the Recent events section. Identify the component that prevents the option group from being in sync, and then remove that component from the database.
-or-
If the option group is in the in-sync state, then check that the S3_INTEGRATION option is included in the option group. If it isn't, then add the S3_INTEGRATION option.
Note: If you receive the error "ORA-00904: "RDSADMIN"."RDSADMIN_S3_TASKS"."UPLOAD_TO_S3": invalid identifier", then make sure that you add the S3_INTEGRATION option to your database instance.
Downtime doesn't occur when you add the S3_INTEGRATION option. If you plan to use the Apply Immediately option, then be sure that there aren't pending maintenance actions. If there are pending maintenance actions, then those actions are applied along with the S3 integration change. For more information, see Maintaining a DB instance.
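If you prefer the AWS CLI, a command like the following adds the S3_INTEGRATION option to an existing option group; example-option-group is a placeholder for the option group name you noted earlier.

```shell
# Add the S3_INTEGRATION option to the instance's option group.
# Replace example-option-group with your own option group name.
aws rds add-option-to-option-group \
  --option-group-name example-option-group \
  --options OptionName=S3_INTEGRATION,OptionVersion=1.0 \
  --apply-immediately
```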
Upload and download files
You can't access the Amazon S3 files directly from the database instance. To access these files, you must first copy them to a directory on the Amazon RDS host. To transfer files between the database instance and an S3 bucket, complete the following steps:
- Use the expdp command or the dbms_datapump API to create the export files in the DATA_PUMP_DIR or a custom directory.
- Use the rdsadmin.rdsadmin_s3_tasks.upload_to_s3 procedure to upload files to the S3 bucket.
- Use the rdsadmin.rdsadmin_s3_tasks.download_from_s3 procedure to download files from the S3 bucket to the DATA_PUMP_DIR folder or a user-created directory.
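As a sketch of the download step above, the following call copies every file in the bucket to the DATA_PUMP_DIR directory object; example-bucket is a placeholder bucket name.

```sql
-- Download all files from the example-bucket bucket to DATA_PUMP_DIR.
-- The call returns a task ID that identifies the transfer log.
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'example-bucket',
         p_directory_name => 'DATA_PUMP_DIR')
       AS task_id
FROM dual;
```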
When you run the rdsadmin upload or download procedure, you can view the logs and monitor the status of the file transfer. To view these logs, complete the following steps:
- Open the Amazon Aurora and RDS console.
- In the navigation pane, choose Databases.
- Choose the database instance that you want to view.
- Choose the Logs & events tab.
- In the Logs section, enter the procedure's task ID in the search filter.
- Select the file that's returned.
- Choose Watch to view the logs, or choose Download to download them.
- To read the log file that's generated during the upload or download process, run the following SELECT statement:
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-task-id.log'));
Note: Replace task-id with the task ID that the procedure returns.
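After a transfer completes, you can confirm that the files arrived in the directory by listing its contents with the rds_file_util.listdir function, shown here for DATA_PUMP_DIR as a sketch.

```sql
-- List the files in the DATA_PUMP_DIR directory object,
-- including their sizes and modification times.
SELECT *
FROM table(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'))
ORDER BY mtime;
```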
Related information
Amazon S3 integration
Why does my Amazon RDS for Oracle DB instance use more storage than I expect?