Writing Flink SSQL to Firehose?

Hello!

I'm new to Flink, and have managed to set up a Kinesis -> Managed Flink pipeline very quickly! It's definitely a very cool system.

In the last paragraph of my notebook, I'm using SSQL to generate a table:

%flink.ssql(type=update)

SELECT 
  window_start, 
  window_end, 
  instance_id, 
  content_id, 
  internal, 
  COUNT(*) AS total 
FROM 
  TABLE(
    TUMBLE(
      TABLE analytics_table, 
      DESCRIPTOR(event_time), 
      INTERVAL '1' MINUTES
    )
  )
GROUP BY 
  window_start, window_end, instance_id, 
  content_id, internal;

The data being shown in the table is precisely what I want; but the challenge I have is that I'm not sure how to persist this data! How can I write it to something like S3 or Firehose?

Thanks!

asked a year ago · 164 views
1 Answer
Hi,

It looks like you are trying to send data from a Kinesis stream to an S3 bucket using Amazon Managed Service for Apache Flink. The documentation below provides a detailed example of this exact use case: https://docs.aws.amazon.com/managed-flink/latest/java/earlier.html#examples-s3
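If you want to stay entirely in SQL inside the Studio notebook, another option is to define a sink table and `INSERT INTO` it from your windowed query. Flink ships a `filesystem` connector (which can write to S3) and a `firehose` connector. Below is a minimal sketch; the bucket path, delivery stream name, region, and the column types (I guessed STRING/BOOLEAN for your id and flag columns) are placeholders you would adjust to match your schema:

```sql
%flink.ssql

-- Sink table backed by S3 via the filesystem connector.
-- 's3://my-bucket/analytics/' is a placeholder path.
CREATE TABLE s3_sink (
  window_start TIMESTAMP(3),
  window_end   TIMESTAMP(3),
  instance_id  STRING,
  content_id   STRING,
  internal     BOOLEAN,
  total        BIGINT
) WITH (
  'connector' = 'filesystem',
  'path'      = 's3://my-bucket/analytics/',
  'format'    = 'json'
);

-- Firehose alternative: swap the WITH clause for
--   'connector'       = 'firehose',
--   'delivery-stream' = 'my-delivery-stream',
--   'aws.region'      = 'us-east-1',
--   'format'          = 'json'

-- Persist the windowed aggregation into the sink.
INSERT INTO s3_sink
SELECT
  window_start, window_end, instance_id,
  content_id, internal, COUNT(*) AS total
FROM TABLE(
  TUMBLE(TABLE analytics_table, DESCRIPTOR(event_time), INTERVAL '1' MINUTES)
)
GROUP BY window_start, window_end, instance_id, content_id, internal;
```

Two things to note: a tumbling-window aggregate emits each row once when the window closes, so it is append-only and works with these append-only sinks; and the filesystem sink only commits part files on checkpoints, so make sure checkpointing is enabled in your application configuration.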

AWS
SUPPORT ENGINEER
answered a year ago

