Yes, files are not automatically deleted from S3. As the tutorial notes: "Even if you delete files from the source repository, the S3 deploy action does not delete S3 objects corresponding to deleted files."
The best approach depends on the availability requirements of the website (more details would help), but here are two options.
Option 1:
Assumption: you publish the whole website, with all documents, on every deploy.
The simplest way to handle this is to clean up the bucket before each deploy. You could add the build stage that the tutorial skips and use it to issue [AWS CLI S3 commands](https://docs.aws.amazon.com/cli/latest/reference/s3/), e.g. `aws s3 rm --recursive`, to empty the bucket.
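As a sketch, a minimal CodeBuild buildspec for such a stage could look like the following config fragment. Note the bucket name, the `BUCKET_NAME` environment variable, and the `./build` output directory are placeholders, not details from this thread:

```yaml
# buildspec.yml -- hypothetical sketch; BUCKET_NAME is assumed to be
# set as a CodeBuild environment variable pointing at the site bucket.
version: 0.2
phases:
  build:
    commands:
      # Remove every object so the subsequent S3 deploy action
      # uploads a clean copy of the site.
      - aws s3 rm "s3://${BUCKET_NAME}" --recursive
```

Alternatively, `aws s3 sync ./build "s3://${BUCKET_NAME}" --delete` uploads the new build and removes stale objects in one step, which avoids a window where the bucket is completely empty.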
Tip: you can make the process smoother by putting CloudFront, with caching enabled, in front of the S3 origin. While the bucket is being cleaned up, CloudFront still serves the cached version, so users are unaffected; once the cache TTL expires, CloudFront routes back to the S3 origin and the deleted files are no longer served, which is what you want.
Option 2:
Instead of deleting the whole website from S3, compare the current build against the previous tag in Git, find the files/folders that were deleted, and use a script to generate the corresponding list of objects to delete from S3.
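A minimal sketch of that diff step, assuming the previous deploy is marked by a Git tag (the tag and bucket names below are placeholders, not from this thread):

```shell
# Hypothetical sketch of Option 2; "v1.0" and "my-bucket" are placeholders.

# list_deleted PREV_REF NEW_REF
# Prints the paths deleted between two git refs:
# --diff-filter=D keeps only deletions, --name-only prints just the path.
list_deleted() {
  git diff --name-only --diff-filter=D "$1" "$2"
}

# Example wiring: remove the matching objects from S3.
# list_deleted v1.0 HEAD | while read -r path; do
#   aws s3 rm "s3://my-bucket/$path"
# done
```

The `aws s3 rm` loop is left commented since it needs real credentials and a bucket; the `git diff --diff-filter=D` part is the core of the approach.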
I had to create a custom AWS construct to replace `S3DeployAction`: it deploys to S3 and deletes stale files. The construct diffs the changes in a Lambda function, which deletes old files and uploads new ones to S3. You can use the TypeScript construct from here.
I have been using AWS Amplify as a self-contained alternative for hosting a static website. Every commit/PR on the repository triggers a pipeline. It is important to compare its storage and service costs against plain S3. There is a quick tutorial at https://aws.amazon.com/getting-started/hands-on/host-static-website/ if you are interested in trying it out. Hope that helps!
Thanks for the quick response and the options, Aleksandr.
The issue we will have with Option 1 is that our pipeline is deploying the site to three different environments, so it will look like this: Source > Build > DeployToDev > DeployToQA > DeployToProd
In this case, we cannot clean up all three buckets in the Build stage, because the transition to Prod may have been disabled so QA can do some manual testing. I guess we can add a build stage before each deployment stage to delete the files in the corresponding bucket just before the new deployment.
We already have CloudFront in place, so that will help make the deployments smoother.
Option 2 feels a bit hacky in my opinion, so we will probably go for some variation of option 1.
Thanks again.