Need suggestions to automate converting GLB files to USDZ using a Docker command on an EC2 instance.


We run a Docker container on an EC2 instance (i-05d****, Ubuntu) to convert GLB files to USDZ. The setup works, and we can run the conversion from the EC2 command line. Now we want to offer this conversion to users on our webpage: they first upload a GLB file (this part already works), but we don't know how to trigger the conversion from the webpage and need help with that.

  1. First step: the file is uploaded to an S3 bucket, in our case bucket_name (ap-south-1).
  2. Second step: convert the .glb file into .usdz. Run manually, the following docker command converts the file and uploads the result to the same bucket:

docker run -e INPUT_GLB_S3_FILEPATH='bucket_name/10_Dinesh/8732f71f6eca07050f62b014354c5/model.glb' \
  -e OUTPUT_USDZ_FILE='model.usdz' \
  -e OUTPUT_S3_PATH='bucket_name/10_Dinesh/8732f71f6eca07050f62b014354c5' \
  -e AWS_REGION='ap-south-1' \
  -e AWS_ACCESS_KEY_ID='AKIA6N3W****' \
  -e AWS_SECRET_ACCESS_KEY='0GuRz3b1X8****' \
  -it --rm awsleochan/docker-glb-to-usdz-to-s3

By using the above command we get the .usdz file in the target S3 bucket.

  3. Third step: automate this task, so that each time a user uploads a .glb file to the S3 bucket, a .usdz file is generated in the same bucket.

Does anyone have a solution for this? Essentially, we just need the object path in this command to be filled in automatically for each upload.

2 Answers

Hi,

instead of using plain EC2, I'd suggest using ECS (Elastic Container Service) and integrating it with S3 uploads through EventBridge. Here is a small tutorial on how to do that: https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-ecs-tutorial.html It misses one important point, though: you don't know the input file name in advance. That can be solved with the EventBridge input transformer and ECS container overrides, like this (it's part of a CloudFormation EventBridge Rule resource):

      Targets:
        - Arn: !GetAtt ECSCluster.Arn
          EcsParameters:
            LaunchType: FARGATE
            NetworkConfiguration:
              AwsVpcConfiguration:
                AssignPublicIp: DISABLED
                SecurityGroups:
                  - !Ref SecurityGroup
                Subnets: !Ref PrivateSubnetIds
            TaskDefinitionArn: !Ref TaskDefinition
          Id: !Join [ '-', [ !Ref 'AWS::StackName', s3-upload ] ]
          InputTransformer:
            InputPathsMap:
              S3Key: $.detail.requestParameters.key
            InputTemplate: !Sub |
              {
                "containerOverrides": [
                  {
                    "name": "${AWS::StackName}",
                    "environment": [
                      { "name": "S3_KEY", "value": <S3Key> }
                    ]
                  }
                ]
              }
          RoleArn: !GetAtt EventBusRuleRole.Arn

S3_KEY is the name of the container environment variable that will contain the S3 key of the uploaded object.

With this solution you don't need to write any custom code to handle the upload event itself (as you would if you used Lambda/SQS/SNS).
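
If you extend the converter image with your own entrypoint, the container can derive all the paths the image expects from that single S3_KEY variable. Here's a minimal sketch in Python, assuming a derived image; BUCKET and the 'convert.sh' entrypoint name are placeholders, since the real entrypoint of awsleochan/docker-glb-to-usdz-to-s3 may differ:

# wrapper.py -- hypothetical entrypoint for an image derived from the converter.
# It turns the single S3_KEY override set by EventBridge into the variables the
# converter expects. BUCKET and the 'convert.sh' entrypoint name are assumptions.
import os
import posixpath

key = os.environ["S3_KEY"]                             # e.g. 10_Dinesh/8732.../model.glb
bucket = os.environ.get("BUCKET", "bucket_name")

prefix = posixpath.dirname(key)                        # 10_Dinesh/8732...
stem = posixpath.splitext(posixpath.basename(key))[0]  # model

env = dict(
    os.environ,
    INPUT_GLB_S3_FILEPATH=f"{bucket}/{key}",
    OUTPUT_USDZ_FILE=f"{stem}.usdz",
    OUTPUT_S3_PATH=f"{bucket}/{prefix}",
)

# Replace this process with the image's original entrypoint (name assumed),
# so its exit code and signals pass through unchanged.
os.execvpe("convert.sh", ["convert.sh"], env)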

answered 2 years ago

Hi,

There is an S3 feature called Event Notifications.

This will allow you to publish events when a user uploads an object to your bucket. The event notifications can be sent to Lambda, SNS, or SQS, and the event information will include the S3 bucket and key, which you can use to generate the file path input for your container. More information on event notifications here:

https://docs.aws.amazon.com/AmazonS3/latest/userguide/enable-event-notifications.html
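
For example, with a Lambda target you could keep your existing EC2 + Docker setup and have the function run your docker command on the instance through SSM Run Command. This is only a sketch: the instance ID is a placeholder, the instance must be running the SSM agent, and attaching an IAM role to it lets you drop the hard-coded access keys from the docker command:

# Hypothetical Lambda handler for the S3 Event Notification (Python 3).
# It reads the bucket/key of the uploaded .glb and runs the existing docker
# command on the EC2 instance via SSM Run Command.
import posixpath
import urllib.parse

import boto3

ssm = boto3.client("ssm")

INSTANCE_ID = "i-05d****"  # placeholder: your EC2 instance ID


def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # S3 event keys are URL-encoded, e.g. spaces arrive as '+'
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        if not key.endswith(".glb"):
            continue  # only react to GLB uploads

        prefix = posixpath.dirname(key)                        # folder of the upload
        stem = posixpath.splitext(posixpath.basename(key))[0]  # file name without .glb

        # Same command as in the question; '-it' is dropped (no TTY here) and
        # the access keys are omitted, assuming the instance has an IAM role.
        command = (
            "docker run --rm "
            f"-e INPUT_GLB_S3_FILEPATH='{bucket}/{key}' "
            f"-e OUTPUT_USDZ_FILE='{stem}.usdz' "
            f"-e OUTPUT_S3_PATH='{bucket}/{prefix}' "
            "-e AWS_REGION='ap-south-1' "
            "awsleochan/docker-glb-to-usdz-to-s3"
        )
        ssm.send_command(
            InstanceIds=[INSTANCE_ID],
            DocumentName="AWS-RunShellScript",
            Parameters={"commands": [command]},
        )

The Lambda's execution role needs ssm:SendCommand permission, and this keeps the conversion asynchronous: the upload returns immediately while the instance does the work.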

AWS
Tom-B
answered 2 years ago
