Browse through the questions and answers listed below.
0 answers · 0 votes · 4 views · asked 2 months ago
1 answer · 0 votes · 11 views · asked 3 months ago
1 answer · 0 votes · 9 views · asked a year ago
[IoT Core][Kinesis] How much ingestion throughput determines whether to use Kinesis or IoT Core?
I need to compare Kinesis Firehose and IoT Core as the preferred choice for streaming data ingestion.
From my observation:
* IoT Core is more suitable for infrequent, bi-directional, and network-limited IoT data ingestion from many IoT devices.
* Kinesis Firehose is more suitable for ingesting a constant, fairly large stream of data into the AWS cloud.
My question, then, is "how big"?
For example, suppose a customer's IoT end device produces a constant, read-only sensor data stream of 1 MB/s or 5 MB/s (or an arbitrary X MB/s) that needs to be ingested into the AWS cloud. Should they consider IoT Core or Kinesis Firehose/Data Streams?
Let's assume the data is reasonably well-formed JSON and will be saved directly to S3.
What's the threshold X MB/s value that determines whether to use IoT Core or Kinesis?
Thanks!
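A rough back-of-envelope comparison of the per-message and per-shard quotas can help frame where a threshold might sit. The sketch below is only illustrative; the constants (a 128 KB maximum MQTT payload for IoT Core, 1 MiB/s and 1,000 records/s write capacity per Kinesis shard) are my understanding of the published quotas and should be verified against the current service limits pages.

```python
# Rough sizing sketch (not an official threshold): compare an X MB/s device
# stream against assumed per-message / per-shard quotas.
IOT_CORE_MAX_MQTT_PAYLOAD_KB = 128        # assumed IoT Core max MQTT payload
KINESIS_SHARD_WRITE_MIB_PER_S = 1         # assumed per-shard write throughput
KINESIS_SHARD_WRITE_RECORDS_PER_S = 1000  # assumed per-shard records/s limit

def sizing(x_mb_per_s: float, avg_record_kb: float = 4.0) -> None:
    records_per_s = (x_mb_per_s * 1024) / avg_record_kb
    shards_for_throughput = x_mb_per_s / KINESIS_SHARD_WRITE_MIB_PER_S
    shards_for_records = records_per_s / KINESIS_SHARD_WRITE_RECORDS_PER_S
    print(f"{x_mb_per_s} MB/s at ~{avg_record_kb} KB per record:")
    print(f"  IoT Core: ~{records_per_s:.0f} MQTT messages/s from one device")
    print(f"  Kinesis:  ~{max(shards_for_throughput, shards_for_records):.1f} shards of write capacity")

for x in (1, 5):
    sizing(x)
```

For 5 MB/s this works out to roughly five shards' worth of write capacity, or well over a thousand 4 KB messages per second from a single device.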
Accepted Answer · Amazon Kinesis
1 answer · 0 votes · 5 views · asked a year ago
Why so many shardIds when I've only configured 3 in my Kinesis Stream?
I have Kinesis consumer code that does a DescribeStream and then spins up a new Java thread per shardId to consume off each shard.
I get 8 shardIds when I've only configured 3 in my Stream. Why is that? I don't want to have 5 extra threads consuming constantly and getting zero records. Below, you can see I'm logging the total # of records processed on each shard.
2020-11-19 08:59:49 INFO GetRecords:109 - # Kinesis consumers: 8
2020-11-19 08:59:49 INFO GetRecords:112 - Kinesis - ShardId: 'shardId-000000000000', Total Records: 0
2020-11-19 08:59:49 INFO GetRecords:112 - Kinesis - ShardId: 'shardId-000000000001', Total Records: 0
2020-11-19 08:59:49 INFO GetRecords:112 - Kinesis - ShardId: 'shardId-000000000002', Total Records: 0
2020-11-19 08:59:49 INFO GetRecords:112 - Kinesis - ShardId: 'shardId-000000000003', Total Records: 19110
2020-11-19 08:59:49 INFO GetRecords:112 - Kinesis - ShardId: 'shardId-000000000004', Total Records: 0
2020-11-19 08:59:49 INFO GetRecords:112 - Kinesis - ShardId: 'shardId-000000000005', Total Records: 0
2020-11-19 08:59:49 INFO GetRecords:112 - Kinesis - ShardId: 'shardId-000000000006', Total Records: 18981
2020-11-19 08:59:49 INFO GetRecords:112 - Kinesis - ShardId: 'shardId-000000000007', Total Records: 16195
**Background:** I started with 1 shard, then reconfigured to 2, then 3. Does this have something to do with the other shardIds that have 0 records? If so, what is the recommended code/practice to ignore that type of shard?
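For context: each reshard splits (or merges) shards, and the old parent shards stay in the stream until the retention period passes, so DescribeStream keeps returning them alongside the new open shards. One common workaround, sketched below with boto3 rather than the Java SDK, is to skip any shard that already has an ending sequence number, since only closed (parent) shards carry one. The stream name is a placeholder.

```python
import boto3

STREAM_NAME = "my-stream"  # placeholder
kinesis = boto3.client("kinesis")

def open_shards(stream_name: str):
    """Return only shards that are still open (no EndingSequenceNumber)."""
    shards, token = [], None
    while True:
        kwargs = {"NextToken": token} if token else {"StreamName": stream_name}
        resp = kinesis.list_shards(**kwargs)
        shards.extend(resp["Shards"])
        token = resp.get("NextToken")
        if not token:
            break
    # Closed (parent) shards carry an EndingSequenceNumber; skip them.
    return [s for s in shards
            if "EndingSequenceNumber" not in s["SequenceNumberRange"]]

for shard in open_shards(STREAM_NAME):
    print(shard["ShardId"])
```

The Kinesis Client Library handles this bookkeeping (and parent-before-child processing order) automatically, so it may be worth considering instead of a hand-rolled thread-per-shard consumer.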
Accepted Answer · Amazon Kinesis
1 answer · 0 votes · 10 views · asked a year ago
Hide AWS access key from source code
I'm following the tutorial at
https://docs.aws.amazon.com/kinesisvideostreams/latest/dg/hls-playback.html#how-hls-ex1
It works great! But the access key is visible to the user in the source code. How can I go about hiding the access key from the user?
The tutorial puts it right in the source code:
var options = {
    accessKeyId: $('#accessKeyId').val(),
    secretAccessKey: $('#secretAccessKey').val(),
    sessionToken: $('#sessionToken').val() || undefined,
    region: $('#region').val(),
    endpoint: $('#endpoint').val() || undefined
};
var kinesisVideo = new AWS.KinesisVideo(options);
var kinesisVideoArchivedContent = new AWS.KinesisVideoArchivedMedia(options);
I'm using a Django web server.
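One common pattern, sketched below, is to keep the AWS credentials on the Django server and have the browser request only the HLS playback URL, so no key ever reaches the client. The view uses boto3; the stream name and URL route are hypothetical, and credentials would come from the server's environment or an attached IAM role rather than the page.

```python
# views.py -- minimal sketch: generate the HLS URL server-side so the browser
# never sees AWS credentials. Stream name and error handling are illustrative.
import boto3
from django.http import JsonResponse

STREAM_NAME = "my-kvs-stream"  # hypothetical stream name

def hls_url(request):
    # Credentials are resolved server-side (env vars, instance role, etc.).
    kvs = boto3.client("kinesisvideo")
    endpoint = kvs.get_data_endpoint(
        StreamName=STREAM_NAME,
        APIName="GET_HLS_STREAMING_SESSION_URL",
    )["DataEndpoint"]

    archived = boto3.client("kinesis-video-archived-media", endpoint_url=endpoint)
    url = archived.get_hls_streaming_session_url(
        StreamName=STREAM_NAME,
        PlaybackMode="LIVE",
    )["HLSStreamingSessionURL"]

    # The signed, time-limited URL is all the front end needs for playback.
    return JsonResponse({"hls_url": url})
```

The front-end JavaScript would then fetch this endpoint and hand the returned URL straight to the video player, instead of constructing AWS.KinesisVideo clients in the browser.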
Accepted Answer · Amazon Kinesis
1 answer · 0 votes · 1 view · asked 2 years ago
Connectivity to external SMTP server
I want to be able to import all emails being received on an external SMTP server for an externally managed domain (say waffles.com).
1. How do I ingest all the emails received by all users of waffles.com into my AWS S3 bucket? Assume that I am able to create a new user on that SMTP server with admin rights. How do I create a connection to that server so that all emails can be dumped into my bucket for post-processing?
I would really appreciate it if anyone could share insights on which services to use, or any other ideas that could help me achieve this. Thanks.
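One caveat: SMTP only delivers mail, so pulling it off an existing server is normally done over IMAP or POP3, or by pointing the domain's MX records at Amazon SES and using an S3 action in a receipt rule so each incoming message lands in the bucket automatically. Purely as an illustration of the polling route, and assuming the admin account can read mailboxes over IMAP, a small script could copy raw messages into S3 like this (host, credentials, and bucket name are placeholders):

```python
# Illustrative only: pull raw messages over IMAP and archive them in S3.
# Host, credentials, and bucket are hypothetical placeholders.
import imaplib
import boto3

IMAP_HOST = "mail.waffles.com"
IMAP_USER = "mail-admin"
IMAP_PASSWORD = "change-me"
BUCKET = "my-mail-archive-bucket"

s3 = boto3.client("s3")

with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
    imap.login(IMAP_USER, IMAP_PASSWORD)
    imap.select("INBOX", readonly=True)
    _, data = imap.search(None, "ALL")
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        raw_message = msg_data[0][1]  # raw RFC 822 bytes
        s3.put_object(
            Bucket=BUCKET,
            Key=f"inbox/{num.decode()}.eml",
            Body=raw_message,
        )
```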
Accepted Answer · Amazon Kinesis
2 answers · 0 votes · 2 views · asked 3 years ago
Kinesis data streams limits
Hi everyone!
Could someone explain the following to me? It relates to the Kinesis Data Streams limits found in the documentation:
[https://docs.aws.amazon.com/streams/latest/dev/service-sizes-and-limits.html][1]
1. GetRecords can retrieve up to 10 MiB of data per call from a single shard, and up to 10,000 records per call. Each call to GetRecords is counted as one read transaction.
2. Each shard can support up to five read transactions per second. Each read transaction can provide up to 10,000 records with an upper limit of 10 MiB per transaction.
3. Each shard can support up to a maximum total data read rate of 2 MiB per second via GetRecords. If a call to GetRecords returns 10 MiB, subsequent calls made within the next 5 seconds throw an exception.
I find these points not very consistent and maybe contradictory. Did I miss something?
I appreciate your help.
[1]: https://docs.aws.amazon.com/streams/latest/dev/service-sizes-and-limits.html
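The way I read these limits (my interpretation, worth verifying against the documentation): 10 MiB / 10,000 records is the per-call burst cap, 5 calls per second is the transaction cap, and 2 MiB/s is the sustained average per shard, so a single 10 MiB response simply uses up about 5 seconds of that 2 MiB/s budget, which is why further calls in that window are throttled. A minimal single-shard polling sketch that respects all three (stream and shard IDs are placeholders):

```python
# Sketch of a single-shard consumer that stays under the per-shard read limits:
# at most 5 GetRecords calls/s and ~2 MiB/s sustained. Names are placeholders.
import time
import boto3

kinesis = boto3.client("kinesis")

iterator = kinesis.get_shard_iterator(
    StreamName="my-stream",
    ShardId="shardId-000000000000",
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]

while iterator:
    resp = kinesis.get_records(ShardIterator=iterator, Limit=10000)
    payload_bytes = sum(len(r["Data"]) for r in resp["Records"])
    iterator = resp.get("NextShardIterator")

    # One call per 200 ms keeps us at <= 5 read transactions per second; after
    # a large burst, back off long enough to keep the sustained rate near
    # 2 MiB/s for this shard (a 10 MiB response implies ~5 s of budget).
    time.sleep(max(0.2, payload_bytes / (2 * 1024 * 1024)))
```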
Accepted Answer · Amazon Kinesis
1 answer · 0 votes · 11 views · asked 3 years ago
I have a question on the KCL (Kinesis Client Library)
A customer currently uses Kinesis (Data Streams, I assume) in their own IoT service.
The customer believes that the KCL records how far data processing has progressed when an EC2 instance is scaled in or terminated.
(There is a possibility of an EC2 instance being terminated while it is handling data.)
**Q1)** They wonder how this record is handled.
Or
**Q2)** Do they have to set a lifecycle hook for when EC2 instances are scaled in, or does the KCL manage this record by itself?
The customer would like an example or recommended practice.
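Some context, hedged since I don't know the customer's exact setup: the KCL checkpoints each shard's progress in a DynamoDB lease table named after the KCL application. When a worker disappears (scale-in, termination), another worker takes over the lease and resumes from the last checkpoint, so a lifecycle hook is not strictly required for correctness as long as processing between checkpoints is idempotent. A small boto3 sketch to inspect that table; the table name is a placeholder, and the attribute names (leaseKey, checkpoint, leaseOwner) reflect my understanding of the lease schema:

```python
# Inspect the KCL lease table to see per-shard checkpoints and lease owners.
# The table name matches the KCL application name (placeholder here).
import boto3

TABLE_NAME = "my-kcl-application"  # placeholder

table = boto3.resource("dynamodb").Table(TABLE_NAME)

for item in table.scan()["Items"]:
    print(
        f"shard={item.get('leaseKey')} "
        f"checkpoint={item.get('checkpoint')} "
        f"owner={item.get('leaseOwner')}"
    )
```

If shards should be drained gracefully before an instance is removed, an Auto Scaling lifecycle hook that lets the worker finish its current batch and checkpoint is a reasonable addition, but the lease takeover described above is what guarantees another worker picks up where processing left off.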
Accepted Answer · Amazon Kinesis
1 answer · 0 votes · 4 views · asked 3 years ago
Kinesis data stream - data transfer charges for on-premises consumer/producer
A customer is asking how egress data from a Kinesis data stream to their on-premises consumer is charged.
The [Kinesis pricing page][1] indicates: "Data transfer is free. AWS does not charge for data transfer from your data producers to Amazon Kinesis Data Streams, or from Amazon Kinesis Data Streams to your Amazon Kinesis Applications."
Based on this, I would assert that all data transfer to and from Kinesis is free. I am somewhat wary, though, that this may not apply across regions and/or from on-premises.
Can someone please clarify this for me while I go back to the customer to get a better understanding of their approach? Having remote producers and applications may introduce completely different challenges altogether.
[1]: https://aws.amazon.com/kinesis/data-streams/pricing/
Accepted Answer · Amazon Kinesis
1 answer · 0 votes · 6 views · asked 4 years ago
Kinesis streams for low-latency apps
A customer is interested in re-factoring their credit-card processing app, and they want to use "streams" to glue the various parts together. They are considering either Kinesis or Kafka. Besides the fact that Kinesis is not PCI compliant (client-side encryption), they are concerned about the 5 read TPS limit per shard. That seems somewhat low for a real-time app.
The customer only has a 100-200 ms time budget to process a transaction. While I know that Kinesis can pull multiple records per read, the 5 TPS limit implies a potential 200 ms gap between reads. While not a lot, this may be a show-stopper for my customer. Is there any way to make Kinesis more responsive (higher read TPS), or is this gap a non-issue in terms of "real-world" latency?
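One option worth evaluating is enhanced fan-out (the SubscribeToShard API), where records are pushed to a registered consumer over a long-lived connection instead of being polled at 5 GetRecords calls per second, which removes the 200 ms polling gap and gives each registered consumer its own per-shard read throughput. A minimal boto3 sketch follows; the stream, consumer, and shard identifiers are placeholders, and production code would wait for the consumer to become ACTIVE and re-subscribe when the subscription expires after about five minutes.

```python
# Sketch: enhanced fan-out consumer using SubscribeToShard (push delivery),
# avoiding the polling gap implied by 5 GetRecords calls/s per shard.
# Stream, consumer, and shard identifiers are placeholders.
import boto3

kinesis = boto3.client("kinesis")

stream_arn = kinesis.describe_stream(StreamName="card-txn-stream")[
    "StreamDescription"]["StreamARN"]

# Register a dedicated consumer; it gets its own read pipe per shard.
consumer_arn = kinesis.register_stream_consumer(
    StreamARN=stream_arn, ConsumerName="txn-processor"
)["Consumer"]["ConsumerARN"]

def handle(data: bytes) -> None:
    # Hypothetical transaction handler.
    print(f"processing {len(data)} bytes")

# Records are pushed as events on this subscription (valid ~5 minutes,
# after which the consumer re-subscribes from the continuation point).
response = kinesis.subscribe_to_shard(
    ConsumerARN=consumer_arn,
    ShardId="shardId-000000000000",
    StartingPosition={"Type": "LATEST"},
)
for event in response["EventStream"]:
    for record in event.get("SubscribeToShardEvent", {}).get("Records", []):
        handle(record["Data"])
```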
Accepted Answer · Amazon Kinesis
1 answer · 0 votes · 3 views
Kinesis fan-out
A customer is looking to build a message processing system using Kinesis. The volume is relatively low (so only one shard will be needed), but low-ish latency is important (<1 sec end-to-end). Currently, there would be four consumers of the stream. However, the expectation is that more consumers will be added in the future.
Do we have best-practice guidelines on fanning out Kinesis streams to multiple consumers?
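One consideration, hedged because guidance here evolves: with the classic shared-throughput model, all consumers of a shard share its 2 MB/s and 5 reads-per-second limits, so four pollers on one shard already average roughly one read per second each, which works against the sub-second latency goal as more consumers are added. Registering each consuming application as an enhanced fan-out consumer gives each one dedicated per-shard throughput. A minimal sketch (stream and application names are placeholders):

```python
# Sketch: register one enhanced fan-out consumer per consuming application so
# each gets dedicated per-shard read throughput. Names are placeholders.
import boto3

kinesis = boto3.client("kinesis")

stream_arn = kinesis.describe_stream(StreamName="message-stream")[
    "StreamDescription"]["StreamARN"]

for app in ("billing", "audit", "notifications", "analytics"):
    consumer = kinesis.register_stream_consumer(
        StreamARN=stream_arn,
        ConsumerName=f"{app}-consumer",
    )["Consumer"]
    print(consumer["ConsumerName"], consumer["ConsumerStatus"])
```

Each application would then read via SubscribeToShard (or KCL 2.x, which uses it under the hood) rather than sharing a single polled GetRecords budget.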
Accepted Answer · Amazon Kinesis
1 answer · 0 votes · 5 views
Kinesis - tools to operationally manage streams
Are there any third-party tools that customers can use to operationally manage Kinesis streams?
For example, if a customer has dozens of Kinesis streams, does a graphical tool exist, outside of the AWS Console and scripting against CloudWatch, to help with the operational burden of managing multiple streams?
Accepted Answer · Amazon Kinesis
1 answer · 0 votes · 2 views · asked 6 years ago