S3 bucket access on EC2 Instance using boto3


I have a Flask application that needs to access some configuration files stored in S3, and I don't want to store these in my git repo as they contain sensitive information. I want to use boto3 to read these files onto the EC2 instance.

I need to know how I will be able to access those files on S3 using boto3. The EC2 instance is authorized to access the S3 bucket, and since the application will be running on the EC2 instance (which is authorized), is it safe to say that any code run on that instance will also have this access? Note that I am going to run the script that reads in these "config files" from my CodeDeploy "appspec.yml".
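For context, here is a minimal sketch of what such a script could look like. It assumes the instance profile grants `s3:GetObject` on the bucket; the bucket name, object key, and local path are placeholders, not anything from this question:

```python
# Minimal sketch: read a config file from S3 on an EC2 instance.
# boto3's default credential chain picks up the instance-role
# credentials automatically, so no access keys are needed here.
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket/key/destination -- replace with your own.
s3.download_file(
    "my-config-bucket",            # S3 bucket holding the config files
    "config/production.ini",       # object key of the config file
    "/etc/myapp/production.ini",   # local path the Flask app reads at startup
)
```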

In summary, will my CodeDeploy deployment be able to use the authorization on my EC2 instance to access the said S3 bucket, or do I need to give CodeDeploy access too? I am aware I can add my access keys to my environment to run this script on my own computer, but how does it work on CodeDeploy?

1 Answer

Accepted Answer

CodeDeploy uses a service role. This service role must have read access to the S3 bucket/object (and its KMS key, if used) as well as permission to your EC2 instance. So, in the scenario you mentioned, the credentials of your EC2 role will not be used to access the S3 object.
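If you want to confirm which credentials a given script actually runs with on the instance, a quick STS check can help. This is a hedged sketch of my own, not part of the answer above; it only assumes boto3 is available on the instance:

```python
# Sketch: print the identity of the credentials this process is using.
import boto3

sts = boto3.client("sts")
identity = sts.get_caller_identity()

# When run on the EC2 instance (e.g. from a CodeDeploy lifecycle hook),
# the ARN reflects the assumed instance role, showing that scripts on the
# instance use the instance profile rather than the CodeDeploy service role.
print(identity["Arn"])
```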

jputro (AWS)
Answered 2 years ago
