
Kinesis Transformation Buffering from Data Stream


Hi, the current pipeline I am implementing follows a Kinesis Data Stream -> Kinesis Delivery Stream -> S3 route. However, even with the buffer intervals (both S3 and transformation) set to nearly 5 minutes, data still appears rapidly in the S3 bucket.

For my purposes, the data from each 5-minute window needs to be combined into a single large file before being pushed to S3, which I assumed Kinesis delivery streams handle under the hood when consuming from a Kinesis data stream.

I'd appreciate any help in pointing out where my implementation could potentially be going astray :).
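For reference, this is roughly how the buffering in question is configured. The sketch below uses boto3 to create a Firehose delivery stream with a Kinesis data stream as its source and 5-minute/128 MB buffering hints on the S3 destination; all ARNs and the stream name are hypothetical placeholders, not values from this thread.

```python
# Hedged sketch: a Firehose delivery stream reading from a Kinesis data
# stream, with S3 buffering hints set to flush every 300 s or at 128 MB,
# whichever threshold is reached first.
s3_buffering_hints = {
    "SizeInMBs": 128,         # maximum allowed for an S3 destination
    "IntervalInSeconds": 300, # 5 minutes
}

def make_s3_destination(bucket_arn: str, role_arn: str) -> dict:
    """Build the ExtendedS3DestinationConfiguration dict for create_delivery_stream."""
    return {
        "BucketARN": bucket_arn,
        "RoleARN": role_arn,
        "BufferingHints": s3_buffering_hints,
    }

if __name__ == "__main__":
    import boto3  # requires AWS credentials; all names below are hypothetical
    firehose = boto3.client("firehose")
    firehose.create_delivery_stream(
        DeliveryStreamName="my-delivery-stream",
        DeliveryStreamType="KinesisStreamAsSource",
        KinesisStreamSourceConfiguration={
            "KinesisStreamARN": "arn:aws:kinesis:us-east-1:123456789012:stream/my-stream",
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-role",
        },
        ExtendedS3DestinationConfiguration=make_s3_destination(
            "arn:aws:s3:::my-bucket",
            "arn:aws:iam::123456789012:role/firehose-role",
        ),
    )
```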

2 Answers
Accepted Answer

Issue resolved, for anyone interested: by switching to direct PUTs into the delivery stream, the files are properly aggregated.

However, I was not able to find a reason why the data stream -> delivery stream transition does not result in proper aggregation of data, although I suspect it may have to do with the data stream shards.
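The "direct PUT" workaround described above can be sketched with boto3 as follows; the delivery stream name and payloads are hypothetical. `PutRecordBatch` accepts at most 500 records per call, so larger sets are chunked first.

```python
def chunk_records(records, batch_size=500):
    """Yield record batches; Firehose PutRecordBatch caps each call at 500 records."""
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]

if __name__ == "__main__":
    import boto3  # requires AWS credentials
    firehose = boto3.client("firehose")
    payloads = [b"event-1\n", b"event-2\n"]  # hypothetical data
    for batch in chunk_records(payloads):
        firehose.put_record_batch(
            DeliveryStreamName="my-delivery-stream",  # hypothetical name
            Records=[{"Data": p} for p in batch],
        )
```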

answered a month ago

A Kinesis Firehose delivery stream has two options for buffering data: Buffer Size and Buffer Interval. If the buffered data exceeds the Buffer Size, it can be delivered to S3 before the Buffer Interval elapses. What value is your Buffer Size set to?

https://docs.aws.amazon.com/firehose/latest/dev/basic-deliver.html#frequency

The frequency of data delivery to Amazon S3 is determined by the Amazon S3 Buffer size and Buffer interval value that you configured for your delivery stream. Kinesis Data Firehose buffers incoming data before it delivers it to Amazon S3. You can configure the values for Amazon S3 Buffer size (1–128 MB) or Buffer interval (60–900 seconds). The condition satisfied first triggers data delivery to Amazon S3. When data delivery to the destination falls behind data writing to the delivery stream, Kinesis Data Firehose raises the buffer size dynamically. It can then catch up and ensure that all data is delivered to the destination.
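The "condition satisfied first" rule quoted above can be modeled with a small sketch (this is an illustration of the documented behavior, not Firehose code):

```python
class BufferPolicy:
    """Models Firehose's flush rule: deliver when either the buffer-size
    or the buffer-interval threshold is reached, whichever comes first."""

    def __init__(self, size_mb=128, interval_s=300):
        self.size_bytes = size_mb * 1024 * 1024
        self.interval_s = interval_s

    def should_flush(self, buffered_bytes, seconds_since_last_flush):
        # Either threshold alone is enough to trigger delivery to S3.
        return (buffered_bytes >= self.size_bytes
                or seconds_since_last_flush >= self.interval_s)
```

With a 128 MB / 300 s policy, a trickle of small records flushes only when the interval expires, while a burst that fills 128 MB flushes immediately, which is why a small Buffer Size can make files appear far more often than the interval suggests.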

answered a month ago
  • The size of the file never exceeds the max buffer size, which is set to the maximum (128 MB) anyway
