I suggest using your preferred trigger (S3, EventBridge, or SQS/SNS) to start a Step Functions state machine, which can then start a Fargate task (Fargate tasks have no 15-minute time limit). You can also add extra logic and error handling in the state machine.
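As a minimal sketch of that pattern, the state machine definition could run the Fargate task synchronously via the `ecs:runTask.sync` integration. The definition is built here as a Python dict in Amazon States Language; all ARNs, cluster, subnet, and task-definition names are placeholders for illustration:

```python
import json

def build_state_machine_definition():
    """Minimal Amazon States Language definition that runs a Fargate task
    and waits for it to finish. All ARNs and names are placeholders."""
    return {
        "Comment": "Run a long job on Fargate, triggered by S3/EventBridge/SQS",
        "StartAt": "RunFargateTask",
        "States": {
            "RunFargateTask": {
                "Type": "Task",
                # The .sync suffix makes Step Functions wait for task completion
                "Resource": "arn:aws:states:::ecs:runTask.sync",
                "Parameters": {
                    "LaunchType": "FARGATE",
                    "Cluster": "arn:aws:ecs:us-east-1:123456789012:cluster/example-cluster",
                    "TaskDefinition": "arn:aws:ecs:us-east-1:123456789012:task-definition/example-job:1",
                    "NetworkConfiguration": {
                        "AwsvpcConfiguration": {
                            "Subnets": ["subnet-00000000"],
                            "AssignPublicIp": "ENABLED"
                        }
                    }
                },
                # Example of the extra error handling mentioned above:
                # retry transient task failures before failing the execution
                "Retry": [{
                    "ErrorEquals": ["States.TaskFailed"],
                    "IntervalSeconds": 30,
                    "MaxAttempts": 2,
                    "BackoffRate": 2.0
                }],
                "End": True
            }
        }
    }

if __name__ == "__main__":
    print(json.dumps(build_state_machine_definition(), indent=2))
```

You would pass this JSON as the `definition` argument when creating the state machine (for example with `boto3`'s `stepfunctions.create_state_machine`).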
What you are looking for is an equivalent of GCP's Cloud Run.
While I'm curious what others suggest, there is no exact Cloud Run equivalent in AWS.
There are a couple of options:
- You can keep a small Fargate task running and scale up when requests come in, or simply use App Runner. This approach does not give you absolutely zero cost when there are no requests, but the overhead is very small;
- You can also use AWS Batch, Lambda, SQS, Step Functions, etc. to process the upload asynchronously; this is the closest to what you want to achieve. AWS Batch is made for the use case you described, so I'm not sure why you believe it would not work for you;
- You also have the option to write your own logic: for example, use a Lambda function to trigger automation that creates, say, a new ECS task, and at the end of processing have the task invoke another Lambda function that deletes the task.
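The last option can be sketched as a pair of small Lambda handlers; everything below (cluster, task-definition, and subnet names) is a hypothetical illustration, not a definitive implementation. One handler starts an ECS task with `boto3`'s `ecs.run_task`; the task invokes the second handler when processing finishes, which calls `ecs.stop_task`. The request parameters are built in a pure function so they can be inspected without calling AWS:

```python
# Sketch of the "roll your own" option: one Lambda starts an ECS task, and a
# second Lambda (invoked by the task when it finishes) stops it. All names
# below are hypothetical placeholders.

def build_run_task_kwargs(cluster, task_definition, subnet_id):
    """Build the keyword arguments for ecs.run_task (pure, no AWS calls)."""
    return {
        "cluster": cluster,
        "taskDefinition": task_definition,
        "launchType": "FARGATE",
        "count": 1,
        "networkConfiguration": {
            "awsvpcConfiguration": {
                "subnets": [subnet_id],
                "assignPublicIp": "ENABLED",
            }
        },
    }

def start_task_handler(event, context, ecs_client=None):
    """Triggered by S3/EventBridge/SQS; launches the processing task."""
    import boto3  # imported lazily so the module loads without AWS deps
    ecs = ecs_client or boto3.client("ecs")
    kwargs = build_run_task_kwargs(
        cluster="example-cluster",
        task_definition="example-processor:1",
        subnet_id="subnet-00000000",
    )
    resp = ecs.run_task(**kwargs)
    return resp["tasks"][0]["taskArn"]

def stop_task_handler(event, context, ecs_client=None):
    """Invoked by the task itself when processing is done; stops the task."""
    import boto3
    ecs = ecs_client or boto3.client("ecs")
    ecs.stop_task(cluster="example-cluster", task=event["taskArn"])
```

Keeping the parameter construction separate from the AWS calls also makes the handlers easy to unit-test by injecting a stub client.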