
Questions tagged with Amazon Kinesis Data Streams


Error when trying to parse OpenTelemetry 0.7.0 data from metric streams

I'm trying to parse OpenTelemetry data from a CloudWatch metric stream that is delivered to an S3 bucket without any compression. I downloaded the files from S3 (binary); they look like this:

```
¹— µ— Ô  cloud.provider aws " cloud.account.id 935705392901  cloud.region us-west-2 x aws.exporter.arnd ...........// ARN... ê  NamespaceAWS/NetworkELB  MetricNameHealthyHostCount =
```

Hex dump:

```
00000000: aec2 020a aac2 020a d401 0a17 0a0e 636c  ..............cl
00000010: 6f75 642e 7072 6f76 6964 6572 1205 0a03  oud.provider....
00000020: 6177 730a 220a 1063 6c6f 7564 2e61 6363  aws."..cloud.acc
// skipped
00000050: 2e72 6567 696f 6e12 0b0a 0975 732d 7765  .region....us-we
00000060: 7374 2d32 0a78 0a10 6177 732e 6578 706f  st-2.x..aws.expo
00000070: 7274 6572 2e61 726e 1264 0a62 6172 6e3a  rter.arn.d.barn:
// skipped
000000e0: c002 12e7 020a 2f61 6d61 7a6f 6e61 7773  ....../amazonaws
000000f0: 2e63 6f6d 2f41 5753 2f4e 6574 776f 726b  .com/AWS/Network
00000100: 454c 422f 556e 4865 616c 7468 7948 6f73  ELB/UnHealthyHos
...
```

I followed the doc at https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch-metric-streams-formats-opentelemetry-parse.html and tried to parse the data with JavaScript, but I keep getting this error:

```
Error: Unknown base64 encoding at char: ¹
```

It is thrown from this line of the sample code in the doc:

```
const reader = new pb.BinaryReader(data)
```

The `google-protobuf` version I'm using is `3.19.3`, and `metrics_service_pb` was generated from the .proto given in the same doc. Does anyone know how to properly parse the binary data with JavaScript?

Thanks, Bill
1 answer | 0 votes | 33 views | asked 8 months ago
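
A note that may help frame the error in the question above: google-protobuf's `BinaryReader` treats a plain string argument as base64 text and tries to decode it, which matches the `Unknown base64 encoding at char` message, so the bytes most likely need to be passed in as a `Uint8Array` (for example, a Node `Buffer` from `fs.readFileSync` called without an encoding). As for the framing itself, a metric stream object in OpenTelemetry 0.7.0 format is a sequence of varint-length-delimited `ExportMetricsServiceRequest` messages. Below is a minimal sketch of that read loop, written in Java against classes assumed to be generated from the .proto referenced in the linked doc; the package name and file path are placeholders:

```java
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;

// Assumed to be generated from the metrics_service .proto in the CloudWatch doc.
import io.opentelemetry.proto.collector.metrics.v1.ExportMetricsServiceRequest;

public class ParseMetricStreamObject {
    public static void main(String[] args) throws Exception {
        // Placeholder path to an uncompressed object downloaded from S3.
        try (InputStream in = new BufferedInputStream(
                new FileInputStream("metric-stream-object.bin"))) {
            ExportMetricsServiceRequest request;
            // parseDelimitedFrom reads one varint-length-prefixed message per
            // call and returns null at end of stream.
            while ((request = ExportMetricsServiceRequest.parseDelimitedFrom(in)) != null) {
                System.out.println(request); // text dump of the decoded metrics
            }
        }
    }
}
```

In the doc's JavaScript sample, the equivalent fix would be making sure `data` reaches `new pb.BinaryReader(data)` as a `Uint8Array` rather than as a string.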

Help processing Kinesis records with KCL and Java

How am I supposed to process the actual record in Java using the KCL? I'm following the guidance at https://docs.aws.amazon.com/streams/latest/dev/kcl2-standard-consumer-java-example.html. I can connect to the data stream and get the number of records available, but what the example doesn't show is how to actually get the record (a JSON string). From the example I can see that `r.data()` returns the record's data as a read-only `ByteBuffer`. I can convert that to a string with `StandardCharsets.US_ASCII.decode(r.data()).toString()`, but the resulting string is definitely encoded. I tried Base64-decoding it, but I get `java.lang.IllegalArgumentException: Illegal base64 character 3f`. What is the simplest way to get the payload? Below is my `processRecords` method:

```java
public void processRecords(ProcessRecordsInput processRecordsInput) {
    try {
        System.out.println("Processing " + processRecordsInput.records().size() + " record(s)");
        processRecordsInput.records().forEach((r) -> {
            try {
                Decoder dec = Base64.getDecoder();
                String myString = StandardCharsets.US_ASCII.decode(r.data()).toString();
                byte[] bt = dec.decode(myString);
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
    } catch (Throwable t) {
        System.out.println("Caught throwable while processing records. Aborting.");
        Runtime.getRuntime().halt(1);
    }
}
```

From here I can get `myString`, but when I get to `bt` I get the exception shown. I have not found a single resource explaining how to get the record. I put the record on the stream with:

```
aws kinesis put-record --stream-name testStream --partition-key 1 --data '{"somedata":"This Data"}'
```
1 answer | 0 votes | 61 views | asked 9 months ago
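
A hedged note on the error in the question above, rather than a confirmed answer: base64 is only the transport encoding used by the CLI and the HTTP API, and by the time the KCL invokes `processRecords`, `r.data()` already holds the decoded payload bytes, so no Base64 step should be needed on the consumer side. Here is a minimal sketch of the record loop under that assumption, decoding the payload as the UTF-8 JSON that `put-record` sent:

```java
import java.nio.charset.StandardCharsets;

import software.amazon.kinesis.lifecycle.events.ProcessRecordsInput;

public void processRecords(ProcessRecordsInput processRecordsInput) {
    processRecordsInput.records().forEach(r -> {
        // r.data() is a read-only ByteBuffer over the raw payload; decode it
        // directly -- the Base64 layer was already stripped in transit.
        String json = StandardCharsets.UTF_8.decode(r.data()).toString();
        System.out.println("Received record: " + json);
    });
}
```

One related detail worth checking on the producer side: AWS CLI v2 expects `--data` to already be base64 unless `--cli-binary-format raw-in-base64-out` is passed, so the bytes that actually land on the stream depend on the CLI version and that flag.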

DB log processing through Kinesis Data Streams into a time series DB

Hi Team, I have an architecture question: how can Aurora PostgreSQL DB log processing be captured through AWS Lambda and Amazon Kinesis Data Streams, with the data finally loaded into a time series database (Amazon Timestream)?

High-level draft data flow:

**Aurora PostgreSQL** ---- DB logs ----> **Lambda** ---- ingest ----> **Kinesis Data Streams** ---- process, join context data ----> **Timestream**

I believe we can load AWS IoT data (sensor and device data) into Timestream through Lambda, Kinesis Data Streams, and Kinesis Data Analytics, and then run analytics on the time series data. But I am not sure how PostgreSQL DB logs (write-ahead logs) can be processed through Lambda, ingested through Kinesis Data Streams, and finally loaded into Timestream. The flow also needs to join some tables, e.g. event-driven tables with the associated Account and Customer tables, before the data lands in Timestream.

I would like to know whether the above flow is sound: we are not processing any sensor/device data (where the sensors capture all the measure and dimension data on the device and load it into Timestream), so Timestream would always be the primary database. If anyone can shed some light on how PostgreSQL DB logs can be integrated with Timestream through Kinesis Data Streams and Lambda, I would appreciate the help. Thanks
1 answer | 0 votes | 74 views | asked 9 months ago
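
Not an authoritative design, but one common shape for the final hop in the flow above is a Lambda function subscribed to the Kinesis stream that writes each enriched event into Timestream via the `WriteRecords` API. A minimal sketch under stated assumptions: the WAL change events arrive as JSON already joined with the Account/Customer context upstream, and the database, table, dimension, and measure names are hypothetical placeholders:

```java
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.KinesisEvent;

import software.amazon.awssdk.services.timestreamwrite.TimestreamWriteClient;
import software.amazon.awssdk.services.timestreamwrite.model.Dimension;
import software.amazon.awssdk.services.timestreamwrite.model.MeasureValueType;
import software.amazon.awssdk.services.timestreamwrite.model.Record;
import software.amazon.awssdk.services.timestreamwrite.model.TimeUnit;
import software.amazon.awssdk.services.timestreamwrite.model.WriteRecordsRequest;

public class WalEventsToTimestream implements RequestHandler<KinesisEvent, Void> {

    private final TimestreamWriteClient timestream = TimestreamWriteClient.create();

    @Override
    public Void handleRequest(KinesisEvent event, Context context) {
        List<Record> records = event.getRecords().stream().map(r -> {
            // The event source mapping delivers the raw payload; assumed here
            // to be a JSON WAL change event enriched upstream.
            String payload = StandardCharsets.UTF_8
                    .decode(r.getKinesis().getData()).toString();
            return Record.builder()
                    .dimensions(Dimension.builder()
                            .name("source").value("aurora-postgres").build())
                    .measureName("wal_event")              // hypothetical schema:
                    .measureValue(payload)                 // store the raw event
                    .measureValueType(MeasureValueType.VARCHAR)
                    .time(String.valueOf(r.getKinesis()
                            .getApproximateArrivalTimestamp().getTime()))
                    .timeUnit(TimeUnit.MILLISECONDS)
                    .build();
        }).collect(Collectors.toList());

        timestream.writeRecords(WriteRecordsRequest.builder()
                .databaseName("db_logs")                   // placeholder names
                .tableName("wal_events")
                .records(records)
                .build());
        return null;
    }
}
```

Two caveats: `WriteRecords` accepts at most 100 records per call, so the Lambda batch size (or a chunking step) has to respect that, and on the capture side Aurora PostgreSQL logical replication (or Database Activity Streams, which publish database activity to a Kinesis stream) is the usual way to get WAL changes out of the database.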