
CloudFront x S3 website + secondary bucket for specific media


Hi everybody, I have a specific need that I'm struggling to make work.

I have CloudFront + OAI configured to serve an S3 bucket with a static website. So far so good; the OAI policy on the main bucket is working as expected.

BUT the business requires us to provide some training videos, which they don't want to put in the same bucket as the website because that content isn't supposed to be static.

I updated the secondary bucket (the one that contains the media) with the same policy as the main bucket (OAI permission), but I'm still getting Access Denied when trying to load the content.

What is stuck in my head is this: if I update the policy to allow access to the secondary bucket from an IP source, the developer can load the content using the S3 URL. If I remove the IP policy and leave only the OAI, the webpage returns a 403 for the content. I searched through the CloudFront and S3 documentation but can't find a scenario where this happens.

One of the suggestions I saw was to create another CloudFront distribution to serve the secondary bucket, but I lack the development knowledge to serve the media without creating a whole webpage. Have you ever been through this situation? Are you aware of any other policy I can set up on the secondary bucket to allow my main bucket + OAI to access the media using the S3 URL? I also tried the pre-signed URL approach, but I couldn't find a way for the developer to generate and embed those URLs in the webpage. (I've seen how to generate them; I just don't know how to fit that into the business need.)

Thank you for your time and help. I appreciate it.

3 Answers
Accepted Answer

Hi Gabriel

I work with CloudFront and S3, and I'd like to help you resolve this issue.

First, your use case is a common one, so there is nothing wrong with the approach.

Generally, S3 returns a 403 when either the permissions on the bucket are incorrect or you are trying to access a path/file that does not exist. Using an OAI is also the best way for CloudFront to access an S3 bucket while keeping the bucket private.

So to do this, make sure your settings are as below:

S3 Bucket

  1. Make sure your origin is an S3 origin type, not a custom origin. This applies to both the static bucket and the media bucket.
  2. Under S3 Bucket Access, select your OAI and let CloudFront update the bucket policy for you.
  3. Save the changes.
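In boto3 terms, the origin entry and the bucket-policy statement that CloudFront manages for you look roughly like this. This is a sketch: the bucket name and OAI ID are placeholders, not values from your account.

```python
# Sketch of the Origin entry inside a CloudFront distribution config.
# "my-media-bucket" and "E2EXAMPLE" are placeholders.
media_origin = {
    "Id": "media-s3-origin",
    "DomainName": "my-media-bucket.s3.amazonaws.com",
    "S3OriginConfig": {
        # Ties the origin to the OAI so CloudFront can read the private bucket.
        "OriginAccessIdentity": "origin-access-identity/cloudfront/E2EXAMPLE"
    },
}

# The matching bucket policy statement CloudFront can add for you
# when you choose "update the bucket policy" in the console.
bucket_policy_statement = {
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::cloudfront:user/"
               "CloudFront Origin Access Identity E2EXAMPLE"
    },
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::my-media-bucket/*",
}
```

Both buckets need an origin like this, each with its own policy statement granting the OAI `s3:GetObject`.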


For the behaviors, I am assuming your static site is at the root of its bucket. That is fine.

  1. Make sure the media content is in a folder and not at the root of the bucket. Example: bucket-name/mediafolder/mediafile.mp4
  2. Create a behavior with the path pattern /mediafolder/*
  3. For the origin, select the S3 media bucket.
  4. Select your viewer protocol and allowed methods.
  5. If you would like to cache the content, select the CachingOptimized policy; otherwise select the CachingDisabled policy.
  6. For the origin request policy, select CORS-S3Origin. If you are not making CORS requests, you can leave this as None. Save the behavior.
  7. Keep the default behavior for all other requests, pointing at your static bucket.
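The steps above can be sketched as a boto3-style cache-behavior entry. The origin ID is a placeholder that must match your media origin, and the cache-policy ID is whichever managed policy you chose (CachingOptimized or CachingDisabled); look up its ID in the CloudFront console or via list-cache-policies.

```python
# Sketch of the extra cache behavior routing media requests to the
# media bucket. All IDs below are placeholders.
media_behavior = {
    "PathPattern": "/mediafolder/*",
    "TargetOriginId": "media-s3-origin",
    "ViewerProtocolPolicy": "redirect-to-https",
    "AllowedMethods": {"Quantity": 2, "Items": ["GET", "HEAD"]},
    # ID of the managed CachingOptimized or CachingDisabled policy.
    "CachePolicyId": "<managed-caching-policy-id>",
    "Compress": True,
}
```

The default behavior (path pattern `*`) keeps pointing at the static-site origin, so everything that isn't under /mediafolder/ is served from the main bucket.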

Please note that when using S3, the URL and the behavior path need to match the S3 folder structure. If your URL ends in /mediafolder/mediafile.mp4, then the bucket needs a matching structure: bucket-name/mediafolder/mediafile.mp4.
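That path matching can be illustrated with a quick check; CloudFront's `*` wildcard behaves much like shell globbing here, so this is an illustration rather than CloudFront's actual matcher:

```python
from fnmatch import fnmatch

# A viewer request for /mediafolder/mediafile.mp4 ...
request_path = "/mediafolder/mediafile.mp4"

# ... is routed by the behavior whose pattern matches:
assert fnmatch(request_path, "/mediafolder/*")       # media behavior
assert not fnmatch("/index.html", "/mediafolder/*")  # falls to default (*)

# CloudFront then requests the object at the same key from the media
# origin, so the bucket must contain mediafolder/mediafile.mp4.
object_key = request_path.lstrip("/")
print(object_key)  # mediafolder/mediafile.mp4
```

If the folder name in the bucket and the path pattern drift apart, the origin returns the 403/404 you were seeing.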

Let me know if the above works. If not, we can dive a bit deeper.

answered 23 days ago
  • Thank you for the help BazzieB_AWS.

    The main problem was my mindset. I configured the origin just as you described, but I kept using the S3 URL instead of "translating" the path to my CloudFront URL.

    After re-creating the origin and all the rules, I can now access the media through CloudFront, and the S3 URL is blocked.

    Thank you for the help.


You can place the media files in the same bucket as the website. However, you would add an additional behavior to the CloudFront distribution with a specific path pattern like "*.mpeg" and use the CachingDisabled policy so that the videos are treated as dynamic content.

answered 24 days ago
  • Hi kentrad, thank you for stopping by to help.

    If we only had a static website, I wouldn't see a problem with providing the media in the same bucket, but the main problem with doing that is the media size. It isn't recommended (as far as I know) to store such big blob files in our GitHub repositories, and doing so hugely increased our deploy and testing times.

    As a last resort we will update the bucket policy and make the content public, but I want to be sure I've covered every solution before going with that approach.

  • I don't understand why the media files would have to be stored in the same repository as your website code. Why not just copy the media files from their source into the same bucket where you deploy the website?


Hello Gabriel,

Full disclosure, I am not an S3 support engineer. I found the query interesting.

I am not sure if you have tried CORS yet. If not, the Amazon S3 CORS documentation covers it in more detail.

Maybe try allowing your domain and the CloudFront domain as AllowedOrigins in the secondary bucket's CORS configuration, and use the URL to the objects in your code?
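A minimal sketch of such a CORS configuration, assuming the domains below are replaced with your site's domain and your distribution's CloudFront domain (both are placeholders here):

```python
import json

# Sketch of a CORS configuration for the media bucket.
# Both origins are placeholders -- substitute your own domains.
cors_config = {
    "CORSRules": [
        {
            "AllowedOrigins": [
                "https://www.example.com",
                "https://dxxxxxxxxxxxxx.cloudfront.net",
            ],
            "AllowedMethods": ["GET", "HEAD"],
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3600,
        }
    ]
}
print(json.dumps(cors_config, indent=2))

# Applied with boto3 it would look like:
#   boto3.client("s3").put_bucket_cors(
#       Bucket="my-media-bucket", CORSConfiguration=cors_config)
```

Note that CORS only governs browser cross-origin checks; the bucket policy (OAI or otherwise) still has to grant read access for the request to succeed at all.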

Please let me know if it works. If not, I will try digging through a few more documents.

Take care and stay safe!

answered 24 days ago
