How to build a data pipeline with a single API Gateway?


We have one API Gateway that receives gzipped meter data 24x7. The data arrives concurrently (sometimes 5,000 POSTs per second, sometimes far fewer), and we are sure the compressed payloads won't exceed the 10 MB API Gateway limit.

We have two goals:

  1. Deliver the decompressed data to S3, but before that we need to do some renaming and verification; for example, we only accept payloads with a correct decoded signature, so we have to use a Lambda function. This one just stores the data.
  2. Ingest the data into another Lambda function, do some processing, and write the results to a Timestream database.

Currently, we are using two Lambdas: one stores the data, the other processes it and writes to Timestream.
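
For illustration, a minimal sketch of the kind of verify-then-store Lambda goal 1 describes. The HMAC-SHA256 scheme, the header name, and the environment variables are assumptions, since the signature format isn't specified above:

```python
# Sketch of a "verify then store" Lambda behind API Gateway (goal 1).
# Assumptions: the signature is an HMAC-SHA256 of the raw gzipped body,
# sent in an "x-signature" header, and the shared secret lives in an
# environment variable. Binary bodies arrive base64-encoded from an
# API Gateway proxy integration.
import base64
import hashlib
import hmac
import os
import uuid

import boto3

s3 = boto3.client("s3")
SECRET = os.environ["METER_SHARED_SECRET"].encode()
BUCKET = os.environ["RAW_BUCKET"]

def handler(event, context):
    body = base64.b64decode(event["body"])
    claimed = event["headers"].get("x-signature", "")
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(claimed, expected):
        return {"statusCode": 403, "body": "bad signature"}
    # "Rename": give each accepted payload a unique key.
    key = f"accepted/{uuid.uuid4()}.gz"
    s3.put_object(Bucket=BUCKET, Key=key, Body=body)
    return {"statusCode": 200, "body": key}
```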

Please suggest a more efficient way to do this job.

1 Answer

It appears you have three main tasks in your flow for processing these incoming compressed files:

  1. Accept incoming gzipped files at a high rate
  2. Decompress and validate each file
  3. Process the validated files and push the data to the Timestream database

For greater efficiency I'd recommend the following:

  1. Use an API Gateway direct integration to store each incoming compressed file directly in an S3 bucket, with no Lambda in the request path. This post describes several patterns to explore; the first one (direct proxy) is an excellent choice (see the first sketch after this list):
    https://aws.amazon.com/blogs/compute/patterns-for-building-an-api-to-upload-files-to-amazon-s3/

  2. Set up an S3 trigger on the incoming bucket so each new file invokes a Lambda function, and create a second S3 bucket where that Lambda stores the validated, decompressed files. The Lambda can also delete the original file after processing (second sketch below). Here is a link describing a similar flow: https://docs.aws.amazon.com/lambda/latest/dg/with-s3-tutorial.html

  3. Configure the second S3 bucket with its own trigger that calls a second Lambda function to process the decompressed files and store the data in the Timestream database (third sketch below).
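
For step 1, a client-side sketch of what the direct proxy pattern looks like to the sender: the device PUTs its gzipped payload straight to API Gateway, which forwards it to S3 without any Lambda in the request path. The endpoint URL, key layout, and record fields here are placeholders:

```python
# Client-side sketch of the direct proxy upload (step 1).
# The endpoint URL and key naming are illustrative placeholders.
import gzip
import json
import uuid

import requests

ENDPOINT = "https://example.execute-api.us-east-1.amazonaws.com/prod"

def upload(meter_readings: list) -> None:
    payload = gzip.compress(json.dumps(meter_readings).encode())
    key = f"incoming/{uuid.uuid4()}.gz"
    resp = requests.put(
        f"{ENDPOINT}/{key}",
        data=payload,
        headers={"Content-Type": "application/gzip"},
        timeout=10,
    )
    resp.raise_for_status()

upload([{"meter_id": "m-1", "ts": 1700000000, "kwh": 1.25}])
```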
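
For step 2, a sketch of the decompress-and-validate Lambda. It is fired by the incoming bucket's ObjectCreated notification; the output bucket name is a placeholder, and the actual validation (e.g. the signature check) is elided:

```python
# Sketch of the step-2 Lambda: triggered by the incoming S3 bucket,
# it gunzips each new object, validates it, writes the result to a
# second bucket, and optionally deletes the original.
import gzip
import urllib.parse

import boto3

s3 = boto3.client("s3")
VALIDATED_BUCKET = "validated-meter-data"  # placeholder name

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        data = gzip.decompress(raw)
        # ... signature/content validation would go here ...
        out_key = key.removesuffix(".gz") + ".json"
        s3.put_object(Bucket=VALIDATED_BUCKET, Key=out_key, Body=data)
        s3.delete_object(Bucket=bucket, Key=key)  # optional cleanup
```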
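
For step 3, a sketch of the Timestream writer Lambda. The database name, table name, and record layout are assumptions about the meter data's shape:

```python
# Sketch of the step-3 Lambda: triggered by the validated bucket, it
# parses each file and batch-writes records to Timestream.
import json

import boto3

s3 = boto3.client("s3")
ts = boto3.client("timestream-write")

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        readings = json.loads(
            s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        )
        records = [
            {
                "Dimensions": [{"Name": "meter_id", "Value": r["meter_id"]}],
                "MeasureName": "kwh",
                "MeasureValue": str(r["kwh"]),
                "MeasureValueType": "DOUBLE",
                # Timestream's default time unit is milliseconds.
                "Time": str(r["ts"] * 1000),
            }
            for r in readings
        ]
        # WriteRecords accepts at most 100 records per call.
        for i in range(0, len(records), 100):
            ts.write_records(
                DatabaseName="meters",   # placeholder
                TableName="readings",    # placeholder
                Records=records[i : i + 100],
            )
```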

There are other options you could explore as well, such as S3 triggers that send events to SNS topics or SQS message queues (a small configuration sketch follows): https://docs.aws.amazon.com/AmazonS3/latest/userguide/ways-to-add-notification-config-to-bucket.html
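
If you go the SQS route for decoupling, a bucket notification can target a queue instead of invoking Lambda directly. A boto3 sketch, where the bucket name and queue ARN are placeholders and the queue policy must already allow S3 to send messages:

```python
# Point the incoming bucket's ObjectCreated events at an SQS queue.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_notification_configuration(
    Bucket="incoming-meter-data",  # placeholder
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": "arn:aws:sqs:us-east-1:123456789012:meter-ingest",
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```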

answered 8 months ago
  • The file creates and deletes (S3 PUT/DELETE requests) will cost a lot at this volume. How about processing first and then saving the result to S3, then reading it back for the other steps?
