Questions tagged with Amazon Kinesis Data Analytics


Kinesis Analytics for SQL Application Issue

Hello, I am having trouble properly handling a query with a tumbling window. My application sends 15 sensor data messages per second to a Kinesis Data Stream, which is used as the input stream for a Kinesis Analytics application. I am trying to run an aggregation query with a GROUP BY clause to process rows in a 60-second tumbling window. The output stream then sends data to a Lambda function. I expect the messages to arrive at the Lambda every 60 seconds, but instead they arrive much faster, almost every second, and the aggregations don't work as expected. Here is the CloudFormation template that I am using:

```
ApplicationCode: |
  CREATE OR REPLACE STREAM "SENSORCALC_STREAM" (
      "name"         VARCHAR(16),
      "facilityId"   INTEGER,
      "processId"    BIGINT,
      "sensorId"     INTEGER NOT NULL,
      "min_value"    REAL,
      "max_value"    REAL,
      "stddev_value" REAL);

  CREATE OR REPLACE PUMP "SENSORCALC_STREAM_PUMP" AS
  INSERT INTO "SENSORCALC_STREAM"
  SELECT STREAM
      "name",
      "facilityId",
      "processId",
      "sensorId",
      MIN("sensorData")         AS "min_value",
      MAX("sensorData")         AS "max_value",
      STDDEV_SAMP("sensorData") AS "stddev_value"
  FROM "SOURCE_SQL_STREAM_001"
  GROUP BY "facilityId", "processId", "sensorId", "name",
      STEP("SOURCE_SQL_STREAM_001".ROWTIME BY INTERVAL '60' SECOND);

KinesisAnalyticsSensorApplicationOutput:
  Type: "AWS::KinesisAnalytics::ApplicationOutput"
  DependsOn: KinesisAnalyticsSensorApplication
  Properties:
    ApplicationName: !Ref KinesisAnalyticsSensorApplication
    Output:
      Name: "SENSORCALC_STREAM"
      LambdaOutput:
        ResourceARN: !GetAtt SensorStatsFunction.Arn
        RoleARN: !GetAtt KinesisAnalyticsSensorRole.Arn
      DestinationSchema:
        RecordFormatType: "JSON"
```

I would really appreciate your help in pointing out what I am missing. Thank you, Serge
0 answers · 0 votes · 28 views · asked 3 months ago
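For reference, a minimal sketch of the Lambda side of such a setup, assuming the documented Kinesis Data Analytics Lambda-output contract (each emitted row arrives base64-encoded, here as JSON per the template's `RecordFormatType`, and must be acknowledged per `recordId`); the processing inside the loop is a placeholder, not taken from the question:

```
import base64
import json


def lambda_handler(event, context):
    """Receive rows emitted from SENSORCALC_STREAM and acknowledge each one."""
    results = []
    for record in event.get("records", []):
        # Each aggregated row is delivered base64-encoded.
        row = json.loads(base64.b64decode(record["data"]))

        # Placeholder: forward or store the per-window aggregate here.
        print(row.get("sensorId"), row.get("min_value"), row.get("max_value"))

        # The service expects an Ok/DeliveryFailed result per recordId.
        results.append({"recordId": record["recordId"], "result": "Ok"})

    return {"records": results}
```

The handler is invoked once per batch of emitted rows, so the invocation rate reflects how often the application emits results, not a schedule of its own.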

Why Records: [] is empty when i consume data from kinesis stream by python script?

I am trying to consume data from a Kinesis data stream using a Python script. The stream was created successfully and data is being produced to it, but when I run the consumer script:

```
import boto3
import json
from datetime import datetime
import time

my_stream_name = 'stream_name'

kinesis_client = boto3.client('kinesis', region_name='us-east-1')

response = kinesis_client.describe_stream(StreamName=my_stream_name)
my_shard_id = response['StreamDescription']['Shards'][0]['ShardId']

shard_iterator = kinesis_client.get_shard_iterator(StreamName=my_stream_name,
                                                   ShardId=my_shard_id,
                                                   ShardIteratorType='LATEST')
my_shard_iterator = shard_iterator['ShardIterator']

record_response = kinesis_client.get_records(ShardIterator=my_shard_iterator, Limit=2)

while 'NextShardIterator' in record_response:
    record_response = kinesis_client.get_records(ShardIterator=record_response['NextShardIterator'], Limit=2)
    print(record_response)
    # wait for 5 seconds
    time.sleep(5)
```

the message data in the output is empty (`'Records': []`):

```
{'Records': [], 'NextShardIterator': 'AAAAAAAAAAFFVFpvvveOquLUe7WO9nZAcYNQdcS6f6a+YGrrrjZo1gULcu/ZYxC7AB+xVlUhgL9UFPrQ22qmcQa6iIsmuKWl26buBk3utXlVqiGuDUYSgqMOtkp0Y7pJwa6N/I0fYfl2PLTXp5Qz8+5ZYuTW1KDt+PeSU3992bwgdOm7744cxcSnYFaQuHqfa0vLlaRBTOACVz4fwjggUBN01WdsoEjKmgtfNmuHSA7s9LLNzAapMg==',
 'MillisBehindLatest': 0,
 'ResponseMetadata': {'RequestId': 'e451dd27-c867-cf3d-be83-edbe95e9da9f',
                      'HTTPStatusCode': 200,
                      'HTTPHeaders': {'x-amzn-requestid': 'e451dd27-c867-cf3d-be83-edbe95e9da9f',
                                      'x-amz-id-2': 'ClSlC3gRJuEqL9YJcHgC2N/TLSv56o+6406ki2+Zohnfo/erFVMDpPqkEWT+XAeeHXCdhYBbnOeZBPyesbXnVs45KQG78eRU',
                                      'date': 'Thu, 14 Apr 2022 14:23:21 GMT',
                                      'content-type': 'application/x-amz-json-1.1',
                                      'content-length': '308'},
                      'RetryAttempts': 0}}
```
2 answers · 0 votes · 118 views · asked 5 months ago
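As a point of comparison (not one of the posted answers), here is a minimal sketch that reads the same stream from `TRIM_HORIZON`, so records written before the iterator was created are returned as well; a `LATEST` iterator only returns records produced after it was obtained. The stream name and region are placeholders:

```
import time

import boto3

STREAM_NAME = 'stream_name'  # placeholder

kinesis = boto3.client('kinesis', region_name='us-east-1')

shard_id = kinesis.describe_stream(StreamName=STREAM_NAME)['StreamDescription']['Shards'][0]['ShardId']

# Start at the oldest available record instead of LATEST.
iterator = kinesis.get_shard_iterator(StreamName=STREAM_NAME,
                                      ShardId=shard_id,
                                      ShardIteratorType='TRIM_HORIZON')['ShardIterator']

while iterator:
    response = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in response['Records']:
        print(record['SequenceNumber'], record['Data'])
    iterator = response.get('NextShardIterator')
    time.sleep(1)  # stay within the per-shard GetRecords rate limit
```

`AT_TIMESTAMP` or a stored sequence number are alternative `ShardIteratorType` values when replaying from the beginning of the retention window is not desired.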

KDA Studio App keep throwing glue getFunction error, but I didn't use any glue function

I followed [this AWS blog post](https://aws.amazon.com/blogs/aws/introducing-amazon-kinesis-data-analytics-studio-quickly-interact-with-streaming-data-using-sql-python-or-scala/) to create a KDA Studio app and changed the output sink to S3 instead of a data stream. Everything works and I can see the results in S3. However, in the KDA error logs, Glue keeps throwing a getFunction error almost every second while the deployed app runs. I only use Glue to define the input/output schemas and don't use any Glue function, so I wonder where it comes from. Please help take a look.

```
@logStream kinesis-analytics-log-stream
@message {"locationInformation":"com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.getFunction(GlueMetastoreClientDelegate.java:1915)","logger":"com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate","message":"software.amazon.kinesisanalytics.shaded.com.amazonaws.services.glue.model.EntityNotFoundException: Cannot find function. (Service: AWSGlue; Status Code: 400; Error Code: EntityNotFoundException; Request ID: <Request ID>; Proxy: null)","threadName":"Thread-20","applicationARN":<applicationARN>,"applicationVersionId":"1","messageSchemaVersion":"1","messageType":"ERROR"}
@timestamp <timestamp>
applicationARN <applicationARN>
applicationVersionId 1
locationInformation com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.getFunction(GlueMetastoreClientDelegate.java:1915)
logger com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate
message software.amazon.kinesisanalytics.shaded.com.amazonaws.services.glue.model.EntityNotFoundException: Cannot find function. (Service: AWSGlue; Status Code: 400; Error Code: EntityNotFoundException; Request ID: <Request ID>; Proxy: null)
messageSchemaVersion 1
messageType ERROR
threadName Thread-20
```
0 answers · 1 vote · 29 views · asked 5 months ago
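One way to confirm that nothing in the account is actually registering Glue functions is to list the user-defined functions in the Glue database that holds the schemas; this is a minimal sketch, and the region and database name are placeholders, not taken from the question:

```
import boto3

# Region and database name are assumptions; use the database backing the
# KDA Studio application's input/output table schemas.
glue = boto3.client('glue', region_name='us-east-1')

response = glue.get_user_defined_functions(DatabaseName='my_kda_database', Pattern='*')
functions = response.get('UserDefinedFunctions', [])
print(functions or 'No user-defined functions registered in this database')
```

An empty list would suggest the getFunction calls come from the runtime's Glue catalog integration probing for functions rather than from anything defined in the account.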