Hi,
You have many different options, but since you posted in the Greengrass forum I'll share a solution based on Greengrass. Greengrass has a component called Stream Manager (learn more at https://docs.aws.amazon.com/greengrass/v1/developerguide/stream-manager.html) which can be used to take in large amounts of data and upload it to the AWS Cloud reliably. The documentation I just linked is for Greengrass version 1, but Stream Manager is also available in Greengrass version 2 with the same features; you should use version 2 if you are just getting started with Greengrass.
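To make that concrete, here is a minimal sketch using the Stream Manager SDK for Python, run from a custom Greengrass component. The stream name, Kinesis stream name, and payload are placeholders for illustration:

```python
import json

from stream_manager import (
    ExportDefinition,
    KinesisConfig,
    MessageStreamDefinition,
    StrategyOnFull,
    StreamManagerClient,
)

# Connects to the Stream Manager component running on the Greengrass core device.
client = StreamManagerClient()

# Define a local stream that Stream Manager exports to a Kinesis data stream.
client.create_message_stream(
    MessageStreamDefinition(
        name="SensorDataStream",  # local stream name (placeholder)
        strategy_on_full=StrategyOnFull.OverwriteOldestData,
        export_definition=ExportDefinition(
            kinesis=[
                KinesisConfig(
                    identifier="KinesisExport",
                    kinesis_stream_name="my-kinesis-stream",  # your Kinesis stream (placeholder)
                )
            ]
        ),
    )
)

# Append a JSON payload; Stream Manager uploads it to Kinesis in the background,
# retrying automatically if the device is offline.
payload = {"device_id": "sensor-1", "temperature": 22.5}
client.append_message("SensorDataStream", json.dumps(payload).encode())
```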
Greengrass runs code that you write locally, like the sketch above. Your code can collect data and import it into Stream Manager. Once the data is in Stream Manager you have several options, but one with quite a lot of flexibility is to export it to Kinesis. Once your data is in the cloud in Kinesis, you can use an AWS Lambda function to read records from the stream, convert them as needed, and write them into your MongoDB database.
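As a rough sketch of that last step, assuming a Lambda function with a Kinesis event source mapping and the pymongo driver; the connection string, database, and collection names here are placeholders:

```python
import base64
import json
import os

from pymongo import MongoClient

# Assumed: a MongoDB connection string supplied via a Lambda environment variable.
mongo = MongoClient(os.environ["MONGODB_URI"])
collection = mongo["iot"]["sensor_data"]  # placeholder database/collection names

def lambda_handler(event, context):
    """Triggered by a Kinesis event source mapping; writes each record to MongoDB."""
    documents = []
    for record in event["Records"]:
        # Kinesis record payloads arrive base64-encoded.
        raw = base64.b64decode(record["kinesis"]["data"])
        documents.append(json.loads(raw))
    if documents:
        # Batch the writes so each Lambda invocation does a single round trip.
        collection.insert_many(documents)
    return {"written": len(documents)}
```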
Hope that helps you with at least one possible solution.
Cheers,
Michael Dombrowski