Static website content does not update


I have a static website hosted on S3 behind CloudFront, with the domain hosted in Route 53. My initial deployment worked fine and the app was served at the domain. But when I upload changes to S3, the modified files are not served. The files in S3 are definitely updating, so it appears that CloudFront is caching and serving the old files.

I found these articles: https://repost.aws/questions/QUfh2n4klfRHuzKpGJsjOkNA/old-s3-content-being-served-help and https://stackoverflow.com/questions/30154461/aws-cloudfront-not-updating-on-update-of-files-in-s3

I have tried: a) creating an invalidation in my CloudFront distribution, b) creating a cache policy with a 0-second minimum TTL / 10-second maximum TTL and applying it to my CloudFront default behavior, and c) adding Cache-Control headers with max-age=1 to my S3 objects (by updating the metadata on all the files). None of these approaches corrects the problem.
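For anyone scripting step (c), here is a minimal boto3 sketch of re-writing the Cache-Control metadata on a single object by copying it onto itself; the bucket name, key, and content type are placeholders, and in practice you would loop over your real objects:

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-site-bucket"   # placeholder: your S3 bucket name
key = "index.html"          # placeholder: one of your site's objects

# Changing metadata on an existing object requires copying it onto itself
# with MetadataDirective="REPLACE".
s3.copy_object(
    Bucket=bucket,
    Key=key,
    CopySource={"Bucket": bucket, "Key": key},
    MetadataDirective="REPLACE",
    CacheControl="max-age=1",
    ContentType="text/html",  # REPLACE also resets Content-Type, so set it again
)
```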

FWIW, I have another distribution on a different domain with exactly the same setup, and I have never had any issues with that one. The only difference is that its domain is hosted by a third party, not Route 53...

Any thoughts would be appreciated.

asked a year ago · 2019 views
2 Answers

You can clear the CloudFront cache (create an invalidation) using the method described in the following document:
https://repost.aws/knowledge-center/cloudfront-clear-cache
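As a reference, a minimal boto3 sketch of creating such an invalidation (the distribution ID is a placeholder; the wildcard path /* invalidates every cached object):

```python
import time
import boto3

cloudfront = boto3.client("cloudfront")

response = cloudfront.create_invalidation(
    DistributionId="DISTRIBUTION_ID",  # placeholder: your CloudFront distribution ID
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        "CallerReference": str(time.time()),  # must be unique per request
    },
)
# The invalidation starts as "InProgress" and usually completes within minutes.
print(response["Invalidation"]["Status"])
```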

EXPERT · answered a year ago
  • Tried this, twice, using /* to be sure to invalidate everything. No joy...

  • Just to be sure, clear your browser cache as well.

  • Yes, I had tried that as well, to no avail. What finally helped was deleting all the files from my S3 bucket (via the console) and then re-deploying my site. That got the latest version to serve, and subsequent minor changes were served correctly too. Obviously I lost the ability to fall back to earlier versions of the app that way, which is not ideal, but at least I'm unstuck. Not sure what happened, but I hope this gives insight to anyone else with similar issues.


I also encountered this same issue and tried changing the Cache-Control max-age to 0 for an individual file that I had changed, but CloudFront still served up the old file. I followed @rePost-User-2827836's suggestion to delete all the files from S3 and re-upload them from my computer, and CloudFront then served the updated site files.

I'm not sure, but perhaps just deleting the one file I changed and re-uploading it to S3 would have worked.
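As a rough sketch of that idea, assuming boto3 and placeholder bucket, key, and local file names, deleting the stale object and uploading the new build might look like this:

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-site-bucket"   # placeholder: your S3 bucket name
key = "index.html"          # placeholder: the object you changed

# Remove the stale object first, then upload the new version from disk.
s3.delete_object(Bucket=bucket, Key=key)
s3.upload_file(
    Filename="build/index.html",             # placeholder: local path to the new file
    Bucket=bucket,
    Key=key,
    ExtraArgs={"ContentType": "text/html"},  # set the MIME type explicitly
)
```

Even after replacing the object, CloudFront may keep serving its cached copy until the TTL expires or an invalidation is created, so this alone may not explain why the full delete-and-redeploy worked.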

I would like to know more about the mechanism by which CloudFront pulls files from the S3 bucket to cache on the edge network (are old copies of files stored somewhere even after they are overwritten?).

answered a year ago
