Unable to stream data from Firehose to Snowflake

I'm trying to configure a streaming pipeline using Amazon Firehose with the new Snowflake integration. Both the AWS and Snowflake accounts are in the us-east-1 region.

I have an existing integration using Firehose and staging files, and the architecture looks like this:

[Kinesis Data Stream] -> [Amazon Data Firehose] -> [S3] <- [Snowpipe + SNS integration] -> [Snowflake target table]

I'd like to integrate the new architecture like this:

[Kinesis Data Stream] -> [Amazon Data Firehose] -> [Snowflake target table]

I have configured a new Firehose instance using a PrivateLink URL, a VPCE ID, and a custom Snowflake user and role I created with permissions to insert data into the target table. The Snowflake user is configured with a password and a private/public key pair (I followed the steps described in the Snowflake quickstart guide).
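For reference, this is roughly how the destination is configured (a minimal sketch: all URLs, ARNs, names, and IDs below are placeholders, and the SnowflakeDestinationConfiguration field names follow my reading of the boto3 Firehose API, so double-check them against the current docs):

```python
def snowflake_destination_config(account_url: str, vpce_id: str,
                                 delivery_role_arn: str,
                                 error_bucket_arn: str) -> dict:
    """Build a SnowflakeDestinationConfiguration dict for Firehose.

    Every value here is a placeholder, not my real account detail.
    """
    return {
        "AccountUrl": account_url,
        "User": "FIREHOSE_USER",
        "PrivateKey": "<private key PEM body>",
        "Database": "MY_DB",
        "Schema": "MY_SCHEMA",
        "Table": "MY_TABLE",
        "SnowflakeRoleConfiguration": {
            "Enabled": True,
            "SnowflakeRole": "FIREHOSE_ROLE",
        },
        "SnowflakeVpcConfiguration": {"PrivateLinkVpceId": vpce_id},
        "RoleARN": delivery_role_arn,
        # S3 bucket used by Firehose for failed-record backup.
        "S3Configuration": {
            "RoleARN": delivery_role_arn,
            "BucketARN": error_bucket_arn,
        },
    }

config = snowflake_destination_config(
    "xy12345.us-east-1.privatelink.snowflakecomputing.com",
    "vpce-0123456789abcdef0",
    "arn:aws:iam::123456789012:role/firehose-delivery-role",
    "arn:aws:s3:::my-firehose-error-bucket",
)

# The stream itself is then created with boto3, along the lines of:
# import boto3
# firehose = boto3.client("firehose", region_name="us-east-1")
# firehose.create_delivery_stream(
#     DeliveryStreamName="my-snowflake-stream",
#     DeliveryStreamType="KinesisStreamAsSource",
#     KinesisStreamSourceConfiguration={...},
#     SnowflakeDestinationConfiguration=config,
# )
```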

In AWS, no error was detected when creating the instance. However, I have two main issues:

  1. My new Firehose instance does not read any records from the Kinesis Data Stream and is not publishing any records to Snowflake. There are no delivery-error metrics or logs, as if absolutely nothing were happening. The existing Firehose is still reading records from KDS, and the AWS documentation states multiple Firehose instances can be attached to the same KDS. I just don't understand why I'm not getting any records.
  2. I deleted the Firehose instance and created a new one using Direct PUT as the source. I published a few test records to the Firehose stream and got the following error message: "An internal error occurred when attempting to deliver data. Delivery will be retried; if the error persists, it will be reported to AWS for resolution." Once again, no records were sent to Snowflake.
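The test records in step 2 were published roughly like this (a sketch: the stream name and payload fields are placeholders, and the actual AWS call is commented out):

```python
import json


def make_record(payload: dict) -> dict:
    """Wrap a JSON payload as a Firehose Record.

    Firehose expects the Data field as bytes; the trailing newline keeps
    concatenated records separable on the destination side.
    """
    return {"Data": (json.dumps(payload) + "\n").encode("utf-8")}


record = make_record({"ID": 1, "EVENT_TYPE": "test"})

# import boto3
# firehose = boto3.client("firehose", region_name="us-east-1")
# firehose.put_record(DeliveryStreamName="my-snowflake-stream", Record=record)
```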

Reading some documentation, I found out that the table column names are case sensitive. I edited the JSON records to use UPPERCASE key names that match the target table column names and retried the streaming. The same "internal error" occurs.
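The rewrite I applied to the records looks like this (the key names are placeholders for my actual columns):

```python
def uppercase_keys(record: dict) -> dict:
    """Rewrite JSON record keys to uppercase.

    Snowflake resolves unquoted identifiers to uppercase, so the record
    keys must match the target table's column names exactly.
    """
    return {key.upper(): value for key, value in record.items()}


original = {"id": 1, "event_type": "test"}
rewritten = uppercase_keys(original)
```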

Is there anything I'm missing here? On the Snowflake side, I have confirmed the user and role have access to write data to the target table. The PrivateLink URL and VPCE ID are in the us-east-1 region, and the Snowflake account is an Enterprise-level account.

Thanks

Marius
asked 2 months ago · 101 views
No Answers
