S3 bucket access on EC2 Instance using boto3


I have a Flask application that needs to access some configuration files stored in S3, and I don't want to store these in my Git repo because they contain sensitive information. I want to use boto3 to read these files onto the EC2 instance.

I need to know how I can access those files on S3 using boto3. The EC2 instance is authorized to access the S3 bucket, and since the application will be running on that (authorized) EC2 instance, is it safe to say that any code run on the instance will also have this access? Note that I am going to run the script that reads in these "config files" from my CodeDeploy "appspec.yml".

In summary, will my CodeDeploy deployment be able to use the authorization on my EC2 instance to access the said S3 bucket, or do I need to give CodeDeploy access too? I am aware that I can add my access keys to the environment on my own computer to run this script, but how does that work on CodeDeploy?
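To make the appspec.yml part of the question concrete, here is a hedged sketch of how such a script might be wired into a deployment. `AfterInstall` is a real CodeDeploy lifecycle event; the file paths and script name are assumptions for illustration only.

```yaml
# Sketch of an appspec.yml that runs a config-fetch script after
# the revision is copied onto the instance (paths are placeholders).
version: 0.0
os: linux
files:
  - source: /
    destination: /opt/myapp
hooks:
  AfterInstall:
    - location: scripts/fetch_config.sh   # wrapper that invokes the boto3 script
      timeout: 60
      runas: root
```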

1 Answer
Accepted Answer

CodeDeploy uses a service role. This service role must have read access to the S3 bucket/object (and to its KMS key, if one is used), as well as permission to your EC2 instance. So, in the scenario you mentioned, the credentials of your EC2 role will not be used to access the S3 object.
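As a rough sketch of the kind of read access the answer describes, an IAM policy attached to the relevant role could look like the following. The bucket name, KMS key ARN, region, and account ID are all placeholders, not values from this thread.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadConfigObjects",
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-app-config/*"
    },
    {
      "Sid": "DecryptConfigObjects",
      "Effect": "Allow",
      "Action": "kms:Decrypt",
      "Resource": "arn:aws:kms:us-east-1:123456789012:key/EXAMPLE-KEY-ID"
    }
  ]
}
```

The `kms:Decrypt` statement is only needed if the objects are encrypted with a customer-managed KMS key.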

AWS
jputro
answered 2 years ago
  • Thought as much. I will test it out and let you know.

  • I was able to add the needed permissions as you advised.
