AWS Redshift Serverless: execute a Python script that makes an API call from a function/stored procedure, scheduled via Query Editor v2


Hi,

I am setting up a zero-code ETL environment using Redshift. Mostly I will be using the scheduler to ingest data from RDS via FDW, but for APIs I would like a stored procedure/function that executes a Python script which makes the API call and writes the results to a Redshift table, and which can be scheduled from Query Editor v2. Is this possible?

Please Help

Regards

IC
Asked 3 months ago · 690 views
2 Answers

To set up a zero-code ETL environment that ingests data from an API into Redshift using Lambda, here is what you can do:

- Create a Lambda function using the Python runtime. The function code should make the API call, parse the response, and insert the data into a Redshift table (a minimal sketch follows this list).
- Grant the Lambda execution role permissions to call the API, insert into Redshift, and use any other dependent services.
- Create an API Gateway REST API and integrate it with the Lambda function using Lambda proxy integration.
- Configure the API Gateway stage for public access.
- Schedule the API Gateway endpoint URL using EventBridge or CloudWatch Events to trigger the Lambda function on a periodic basis (see the scheduling sketch after the summary below).
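Here is a minimal sketch of such a Lambda handler, assuming a Redshift Serverless workgroup and using the Redshift Data API so no database driver or persistent connection is needed inside Lambda. The workgroup name, database, table, and API URL below are placeholder values you would replace with your own.

```python
import json
import urllib.request

import boto3

# Placeholder values -- replace with your own workgroup, database, table and API.
WORKGROUP_NAME = "my-serverless-workgroup"
DATABASE = "dev"
TABLE = "api_ingest"
API_URL = "https://example.com/api/data"

redshift_data = boto3.client("redshift-data")


def lambda_handler(event, context):
    # Fetch JSON records from the external API.
    with urllib.request.urlopen(API_URL) as resp:
        records = json.loads(resp.read())

    # Write each record into Redshift through the Data API, so the function
    # needs no JDBC/ODBC driver and no persistent connection.
    for record in records:
        redshift_data.execute_statement(
            WorkgroupName=WORKGROUP_NAME,
            Database=DATABASE,
            Sql=f"INSERT INTO {TABLE} (payload) VALUES (:payload)",
            Parameters=[{"name": "payload", "value": json.dumps(record)}],
        )

    return {"statusCode": 200, "body": f"Loaded {len(records)} records"}
```

The Lambda execution role would need permission for redshift-data:ExecuteStatement (and redshift-serverless:GetCredentials) in addition to basic Lambda logging permissions.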

This allows ingesting API data into Redshift in a serverless manner without maintaining any dedicated ETL infrastructure. The API Gateway endpoint acts as the trigger that executes the Lambda function, which fetches the data and loads it into Redshift.
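For the periodic trigger, here is a sketch of creating the schedule with EventBridge from Python (boto3). Note that this version points the scheduled rule directly at the Lambda function rather than going through the API Gateway URL, which is simpler when the only caller is the schedule; the rule name, function name, and ARN are placeholder values.

```python
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

# Placeholder names/ARN -- replace with your own function, region and account.
RULE_NAME = "api-ingest-hourly"
FUNCTION_NAME = "api-ingest"
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:api-ingest"

# Create a scheduled rule that fires once an hour.
rule = events.put_rule(
    Name=RULE_NAME,
    ScheduleExpression="rate(1 hour)",
    State="ENABLED",
)

# Point the rule at the Lambda function.
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{"Id": "api-ingest-target", "Arn": FUNCTION_ARN}],
)

# Allow EventBridge to invoke the function.
lambda_client.add_permission(
    FunctionName=FUNCTION_NAME,
    StatementId="allow-eventbridge-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)
```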

Expert
Answered 3 months ago

Thank you for assisting.

Can you kindly provide links or examples to follow for this setup?

IC
Answered 3 months ago
