Transfer data from S3 to Redshift and filter specific keys


Hey! This is my first task with AWS, so the question is more about the best way to solve the problem. The problem: I have an integration with a service that exports data to S3. After each export I need to transfer the data to Redshift and filter out some specific keys.

I have looked at these ways to do it:

  1. A Lambda function to filter out the keys
  2. AWS Batch
  3. Transfer with Airflow and filter the keys via the file list in the manifest

I'm currently looking into the third option; a rough sketch of what I have in mind is below.
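Just a sketch, assuming a recent Airflow 2.x with the Amazon provider's S3ToRedshiftOperator; the bucket, schema, table, and connection names are placeholders, and the manifest file would list only the S3 objects I actually want loaded:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

# Sketch only; bucket, schema, table, and connection ids are placeholders.
# The manifest lists only the exported files (keys) that should be loaded.
with DAG(
    dag_id="export_to_redshift",
    start_date=datetime(2023, 1, 1),
    schedule=None,          # or trigger it after each export
    catchup=False,
) as dag:
    load_export = S3ToRedshiftOperator(
        task_id="copy_manifest_to_redshift",
        s3_bucket="my-export-bucket",
        s3_key="exports/manifest.json",   # manifest with the filtered file list
        schema="public",
        table="exports_clean",
        copy_options=["FORMAT AS JSON 'auto'", "MANIFEST"],
        redshift_conn_id="redshift_default",
        aws_conn_id="aws_default",
    )
```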

What do you think?

2 Answers

Hi, have you considered using Redshift Spectrum to import data into Redshift using SQL?

Assuming the data is in a format that Redshift Spectrum can read, it would greatly simplify your ingestion pipeline.

You can schedule the query that does the import to automate the process: https://docs.aws.amazon.com/redshift/latest/mgmt/query-editor-schedule-query.html
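As a rough sketch of what that could look like (assuming an external schema s3_ext already mapped over the export location and a target table public.exports_clean; all table, column, cluster, and key names below are placeholders), you could run the filtering INSERT ... SELECT through the Redshift Data API, or paste the same SQL into the query editor and schedule it there:

```python
import boto3

# Sketch, not a full implementation: the external schema/table and the
# target table are assumed to exist already; all names are placeholders.
SQL = """
INSERT INTO public.exports_clean (id, payload, exported_at)
SELECT id, payload, exported_at
FROM s3_ext.exports
WHERE key_name NOT IN ('key_to_drop_1', 'key_to_drop_2');  -- filter unwanted keys
"""

client = boto3.client("redshift-data")

# Runs the statement asynchronously against the cluster.
response = client.execute_statement(
    ClusterIdentifier="my-redshift-cluster",   # placeholder
    Database="dev",                            # placeholder
    DbUser="awsuser",                          # placeholder
    Sql=SQL,
)
print(response["Id"])  # statement id, can be polled with describe_statement
```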

AWS
Alex_T
answered 2 years ago

You may want to consider AWS Glue, which is a serverless data integration service.

Load data from Amazon S3 to Amazon Redshift using AWS Glue - https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/load-data-from-amazon-s3-to-amazon-redshift-using-aws-glue.html
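A minimal sketch of what such a Glue job could look like, assuming JSON exports in S3, a Glue connection named redshift-conn, and a target table public.exports_clean; the bucket, field, and key names are placeholders. The point is that the unwanted keys are filtered inside the job before anything is written to Redshift:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import Filter
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Sketch of a Glue ETL job; all names below are placeholders.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the latest export directly from S3.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://my-export-bucket/exports/"]},
    format="json",
)

# Drop records whose key is in the unwanted list.
KEYS_TO_DROP = {"key_to_drop_1", "key_to_drop_2"}
filtered = Filter.apply(frame=raw, f=lambda row: row["key_name"] not in KEYS_TO_DROP)

# Write the filtered data to Redshift through the Glue connection.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=filtered,
    catalog_connection="redshift-conn",
    connection_options={"dbtable": "public.exports_clean", "database": "dev"},
    redshift_tmp_dir="s3://my-export-bucket/temp/",
)

job.commit()
```

The job can be triggered on a schedule or from an S3 event, so it fits the "after each export" requirement without managing any servers.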

AWS
EXPERT
Hernito
answered 2 years ago
