Hi,
Instead of using plain EC2, I'd suggest using ECS (Elastic Container Service) and integrating it with S3 uploads through EventBridge. Here is a small tutorial on how to do that: https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-ecs-tutorial.html Though it misses one important point: you don't know the input file name in advance. That can be achieved using the input transformer of EventBridge together with the container overrides of ECS, like this (it's part of a CloudFormation EventBridge Rule resource):
```yaml
Targets:
  - Arn: !GetAtt ECSCluster.Arn
    EcsParameters:
      LaunchType: FARGATE
      NetworkConfiguration:
        AwsVpcConfiguration:
          AssignPublicIp: DISABLED
          SecurityGroups:
            - !Ref SecurityGroup
          Subnets: !Ref PrivateSubnetIds
      TaskDefinitionArn: !Ref TaskDefinition
    Id: !Join [ '-', [ !Ref 'AWS::StackName', s3-upload ] ]
    InputTransformer:
      InputPathsMap:
        S3Key: $.detail.requestParameters.key
      InputTemplate: !Sub |
        {
          "containerOverrides": [
            {
              "name": "${AWS::StackName}",
              "environment": [
                { "name": "S3_KEY", "value": <S3Key> }
              ]
            }
          ]
        }
    RoleArn: !GetAtt EventBusRuleRole.Arn
```
`S3_KEY` is the name of the container environment variable that will contain the S3 key of the uploaded object.
With this solution you don't need to write any custom code to handle the upload event itself (as you would if you used Lambda/SQS/SNS).
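Inside the task, your application just reads that variable. A minimal sketch of the container side (the helper names and the `/tmp` destination are illustrative assumptions, not from the stack above):

```python
import os


def get_uploaded_object_key() -> str:
    """Return the S3 key injected by the EventBridge input transformer."""
    key = os.environ.get("S3_KEY")
    if not key:
        raise RuntimeError(
            "S3_KEY is not set; was this task started by the EventBridge rule?"
        )
    return key


def local_path_for(key: str, dest_dir: str = "/tmp") -> str:
    """Build a local download path from the S3 key (basename only)."""
    return os.path.join(dest_dir, os.path.basename(key))
```

From there the task can fetch the object, e.g. with boto3's `s3.download_file(bucket, key, local_path)`; note the bucket name still has to reach the container somehow, for instance as another environment variable in the task definition.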
Hi,
There is an S3 feature called Event Notifications.
This will allow you to publish events when a user uploads an object to your bucket. The event notifications can be sent to Lambda, SNS, or SQS; the event information will include the S3 bucket and key, which could be used to generate the file path input for your container. More information on event notifications here:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/enable-event-notifications.html
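As a sketch, a Lambda function wired to such a notification can pull the bucket and key out of each record of the S3 event payload (the handler name is an assumption; the record layout and the URL-encoding of keys follow the documented S3 event structure):

```python
from urllib.parse import unquote_plus


def handler(event, context):
    """Extract (bucket, key) pairs from an S3 Event Notification payload."""
    uploads = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        bucket = s3["bucket"]["name"]
        # S3 URL-encodes keys in event notifications (e.g. spaces become '+')
        key = unquote_plus(s3["object"]["key"])
        uploads.append((bucket, key))
    return uploads
```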