S3 bucket access on EC2 Instance using boto3


I have a Flask application that needs to access some configuration files stored in S3, and I don't want to store these in my git repo as they contain sensitive information. I want to use boto3 to read these files onto the EC2 instance.

I need to know how I will be able to access those files on S3 using boto3. The EC2 instance is authorized to access the S3 bucket, and since the application will be running on the EC2 instance (which is authorized), is it safe to say that any code run on that instance will also have this access? Note that I am going to run the script that reads in these "config files" from my CodeDeploy "appspec.yml".
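For reference, a minimal sketch of what I have in mind (the bucket name, object key, and local path below are hypothetical placeholders):

```python
import boto3

# Hypothetical names; replace with your own bucket, key, and path.
BUCKET = "my-config-bucket"
KEY = "config/app-config.json"
LOCAL_PATH = "/opt/myapp/app-config.json"

# On an EC2 instance with an instance profile attached, boto3 picks up
# credentials automatically; no access keys need to appear in the code.
s3 = boto3.client("s3")
s3.download_file(BUCKET, KEY, LOCAL_PATH)
```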

In summary, will my CodeDeploy instance be able to use the authorization on my EC2 instance to access the said S3 bucket, or do I need to give the CodeDeploy instance access too? I am aware I can add my access keys to my environment on my computer to run this script, but how does it work on CodeDeploy?

1 Answer
Accepted Answer

CodeDeploy uses a service role. This service role must have read access to the S3 bucket/object (and its KMS key, if one is used) as well as permissions for your EC2 instance. So, in the scenario you mentioned, the credentials of your EC2 role will not be used to access the S3 object.
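For example, a minimal sketch of granting that read access to the service role with boto3 (the role name, policy name, and bucket name below are hypothetical placeholders; you can attach the same policy through the console or CloudFormation instead):

```python
import json
import boto3

# Hypothetical names; replace with your own service role and bucket.
ROLE_NAME = "MyCodeDeployServiceRole"
BUCKET = "my-config-bucket"

# Inline policy allowing read access to objects in the bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": [f"arn:aws:s3:::{BUCKET}/*"],
        }
    ],
}

iam = boto3.client("iam")
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="AllowConfigBucketRead",
    PolicyDocument=json.dumps(policy),
)
```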

jputro
answered 2 years ago
  • Thought as much. I will test it out and let you know.

  • I was able to add the needed permissions as you advised.
