ETL Data From Redshift to RDS


I am trying to create a daily ETL job that reads data from Redshift and writes it to my RDS instance. I am using two JDBC connections. I manage to get the data from Redshift, but when I try to push it to RDS I get a "The connection attempt failed." error. Whenever I test the connection, it succeeds.

Here is my code:

```python
# Define Amazon Redshift connection and read data
AmazonRedshift_node1 = glueContext.create_dynamic_frame.from_options(
    connection_type="redshift",
    connection_options={
        "sampleQuery": "SELECT vendor FROM shopify.order_line GROUP BY vendor",
        "useConnectionProperties": "true",
        "connectionName": "Redshift",
    },
    transformation_ctx="AmazonRedshift_node1",
)

# Write data to RDS
RDS_node = glueContext.write_dynamic_frame.from_jdbc_conf(
    frame=AmazonRedshift_node1,
    catalog_connection="aurora-destination-connection",
    connection_options={
        "dbtable": "vendor",
        "database": "etl_redshift_test",
    },
    transformation_ctx="RDS_node",
)
```

I do not want to use the AWS catalog, as I want to store the data in my RDS.

I am stuck at this point and not sure how to proceed, or even whether Glue is the best option for this.

alex
asked 7 months ago · 234 views

1 answer

This error seems to be caused by a connection timeout.

I see that you have two JDBC sources and two Glue connections defined. When a Glue connection is attached to an ETL job, Glue creates ENIs (Elastic Network Interfaces) in the subnet configured in the connection. These ENIs manage access to the resources in your VPC. When multiple such connections are attached to the job, Glue creates ENIs from the first connection only; it does not create or attach ENIs for the other Glue connections.

With that said, are your Redshift cluster and RDS instance in the same VPC or in different VPCs? If both are in the same VPC, check whether the security group of the first connection (I assume it is the Redshift one, since it is working) is allowed in the RDS security group's inbound rules.
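To illustrate the inbound-rule check above, here is a minimal sketch. The rule dictionaries mirror the shape of an EC2 `DescribeSecurityGroups` response, but the group IDs and rules below are hypothetical, not taken from your account:

```python
# Sketch: does the RDS security group allow inbound traffic on a given port
# from the security group used by the Glue connection's ENIs?
def sg_allows_source(inbound_rules, source_sg_id, port=3306):
    """Return True if any inbound rule permits `port` from `source_sg_id`."""
    for rule in inbound_rules:
        from_port = rule.get("FromPort")
        to_port = rule.get("ToPort")
        # IpProtocol "-1" means all traffic on all ports
        all_traffic = rule.get("IpProtocol") == "-1"
        in_range = (
            from_port is not None
            and to_port is not None
            and from_port <= port <= to_port
        )
        if not (all_traffic or in_range):
            continue
        for pair in rule.get("UserIdGroupPairs", []):
            if pair.get("GroupId") == source_sg_id:
                return True
    return False

# Hypothetical inbound rules on the RDS security group
rds_inbound = [
    {
        "IpProtocol": "tcp",
        "FromPort": 3306,
        "ToPort": 3306,
        "UserIdGroupPairs": [{"GroupId": "sg-glue-conn-example"}],
    },
]

print(sg_allows_source(rds_inbound, "sg-glue-conn-example"))  # True
print(sg_allows_source(rds_inbound, "sg-other-example"))      # False
```

In practice you would feed this the real rules from `describe_security_groups` (boto3), but the logic of the check is the same.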

If they are in different VPCs, check whether there is a network path from one VPC to the other. This can be done using VPC peering. For further information on this setup, please refer to this doc - https://aws.amazon.com/blogs/big-data/connecting-to-and-running-etl-jobs-across-multiple-vpcs-using-a-dedicated-aws-glue-vpc/

The AWS Glue Data Catalog is a metadata catalog; it does not actually store the data. Therefore, in your code you can either use a Glue table that points to your RDS table (write_dynamic_frame.from_catalog) or use the connection details directly, as you have done (write_dynamic_frame.from_jdbc_conf or from_options). Either way, you will face this error if the network configuration above is missing.

AWS
answered 7 months ago
