IoT Architecture recommendation - switch from Azure to AWS

Hi guys,

I read a lot of articles and watched some videos about IoT architecture solutions, and I did some proofs of concept too.
Now I want to start building the real system, but I'm not quite sure about the architecture.

Hopefully the community can give me some tips and architecture advice.

Number of devices: 1000-2000, split between different customers.
Some devices send small messages with telemetry data like temperature, pressures, ... (every hour).
Some devices send big binary files with raw data (~50 MB) (every 6 hours).
These big files are formatted like this:

device_id,published_at,[value,value,value,value,value,value,value,value,value,value,value,value,value,value,...]
device_id,published_at,[value,value,value,value,value,value,value,value,value,value,value,value,value,value,...]
device_id,published_at,[value,value,value,value,value,value,value,value,value,value,value,value,value,value,...]
device_id,published_at,[value,value,value,value,value,value,value,value,value,value,value,value,value,value,...]
device_id,published_at,[value,value,value,value,value,value,value,value,value,value,value,value,value,value,...]
.....
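For reference, here's a minimal Python sketch of how I read one line of this format (the example line and field names are just illustrations):

```python
# Minimal sketch: split one line of the raw-data format above into its
# fields. Assumes comma-separated values with the bracketed value list
# at the end of each line.
def parse_line(line: str):
    device_id, published_at, rest = line.strip().split(",", 2)
    values = [float(v) for v in rest.strip("[]").split(",")]
    return device_id, published_at, values

# Example (made-up line):
print(parse_line("dev-001,2019-01-24T07:45:00Z,[21.5,21.6,21.4]"))
```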

I was previously working with Azure, and I had this architecture that was working fine:
-IoT Hub Device Provisioning Service
-IoT Hub for the small telemetry messages
-Azure SQL DB to store my small telemetry messages
-Stream Analytics job to get the messages from the IoT Hub and store them in the SQL DB
-Blob storage to store my big binary files, with the naming convention DeviceName/Date/Filename
-Stream Analytics job to read each newly uploaded blob and feed it as input to other jobs

But for various reasons we have to switch to AWS, and since it's a different approach I'm asking for your knowledge.
Here is what I have so far:
-AWS IoT Device Management (to manage my devices and groups)
-S3 to store my binary files
-DynamoDB
-AWS Lambda

The flow:
1 - IoT devices get registered through IoT Device Management
2 - Devices send telemetry data every hour; the telemetry data is stored in DynamoDB through Lambda
3 - Devices send the big binary files every 6 hours over the MQTT protocol to S3
4 - When a big binary file upload is finished, a Lambda gets triggered and parses the file to store the data in DynamoDB (then deletes the binary file?) - see the sketch after this list
5 - Then, once I have all the data in DynamoDB, I can trigger whatever cloud function...
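Here is a minimal sketch of what I imagine the Lambda in step 4 could look like (the table name, key schema, and keeping the values as a string are my own assumptions):

```python
import boto3
from urllib.parse import unquote_plus

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("RawTelemetry")  # placeholder name

def handler(event, context):
    # Triggered by an S3 ObjectCreated event on the raw-data bucket.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode()
        with table.batch_writer() as batch:
            for line in body.splitlines():
                if not line.strip():
                    continue
                device_id, published_at, rest = line.split(",", 2)
                batch.put_item(Item={
                    "device_id": device_id,        # partition key (assumed)
                    "published_at": published_at,  # sort key (assumed)
                    "values": rest.strip("[]"),    # kept as a string to avoid
                })                                 # DynamoDB float/Decimal issues
        # then delete the binary file? e.g.:
        # s3.delete_object(Bucket=bucket, Key=key)
```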

I don't think I'm getting it quite right. Can someone give me some advice on how to do on AWS what I was doing on Azure?

Any recommendation is really appreciated!
Thank you

Edited by: Doombqr on Jan 24, 2019 7:45 AM

Doombqr
Asked 5 years ago · 224 views
2 Answers

Hey Doombqr,

You are on the right track with AWS. Your telemetry should go over MQTT via AWS IoT Core, and then typically you send larger payloads directly to something like S3 for downstream processing.

Your flows would be:

Device Management: AWS IoT Device Management etc (as you already had)

Telemetry: Device->MQTT->IoT Core->Rule->Lambda->DDB, or, if you are not doing anything in the Lambda, go directly to DDB: Device->IoT Core->Rule->DDB.
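For the direct-to-DDB variant, the rule can be created with the dynamoDBv2 action, e.g. with boto3 (the rule name, topic, table, and role ARN below are placeholders):

```python
import boto3

iot = boto3.client("iot")

# Sketch: route every message published on devices/<id>/telemetry straight
# into DynamoDB; dynamoDBv2 writes one column per JSON field in the payload.
iot.create_topic_rule(
    ruleName="telemetry_to_ddb",
    topicRulePayload={
        "sql": "SELECT *, topic(2) AS device_id FROM 'devices/+/telemetry'",
        "awsIotSqlVersion": "2016-03-23",
        "actions": [{
            "dynamoDBv2": {
                "roleArn": "arn:aws:iam::123456789012:role/iot-ddb-role",
                "putItem": {"tableName": "Telemetry"},
            }
        }],
    },
)
```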

Larger Files: Device->HTTPS->S3->Lambda->....
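A common pattern for the HTTPS upload is to hand the device a short-lived presigned S3 URL. A sketch (bucket and key are placeholders, mirroring your DeviceName/Date/Filename layout):

```python
import boto3

s3 = boto3.client("s3")

# Sketch: presigned URL the device can PUT its 50 MB raw file to over HTTPS.
url = s3.generate_presigned_url(
    "put_object",
    Params={"Bucket": "my-raw-data-bucket",
            "Key": "device-001/2019-01-24/raw.csv"},
    ExpiresIn=900,  # valid for 15 minutes
)
# Device side: HTTP PUT <url> with the file as the request body.
```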

The reason you want to send larger objects directly to S3 is so you don't exceed hard limits within IoT Core over MQTT, such as the 128 KB payload limit.

Let me know if that helps!

-c

AWS
Answered 5 years ago

Hi Craig!

Thanks for the tip about the MQTT payload limit!

I'm gonna try what you said tonight!

Doombqr
Answered 5 years ago
