All Content tagged with Amazon Simple Storage Service
Amazon Simple Storage Service (Amazon S3) is an object storage service offering industry-leading scalability, data availability, security, and performance.
We have a key scheme like the example below. The end of the image name is a timestamp. We perform distributed processing over the images per `right` folder (1k+ workers over 10k+ images under...
Looking [here](https://docs.aws.amazon.com/AmazonS3/latest/userguide/MultiFactorAuthenticationDelete.html) at enabling MFA delete on an S3 bucket, and checking the [AWS CLI v2...
Hi everyone,
I’m running into an issue with S3 presigned URLs generated during a CodeBuild stage. In my setup, I’m specifying that the URL should expire after 24 hours, but it seems like it’s...
Hi,
I need to transfer a logs directory from an EC2 instance to an S3 bucket daily at a specified time. What AWS-managed solutions are available for this? Additionally, I need to scale the solution...
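As one managed-CLI option, a nightly cron entry on the instance can push the directory with `aws s3 sync`; the schedule, paths, and bucket name below are placeholders:

```shell
# Crontab entry (sketch): sync the logs directory to S3 daily at 01:00.
# Assumes the instance profile grants s3:PutObject on the target bucket.
0 1 * * * aws s3 sync /var/log/myapp s3://example-log-bucket/logs/$(hostname)/ --only-show-errors
```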
I am trying to create a Kinesis Firehose stream that can directly write to Iceberg tables in S3. I have defined the Glue Data Catalog in the same account and created a bucket to hold the metadata.
...
Hello,
After I set up a redirect to an external URL using Route 53 and S3 (please see the S3 setup below), the page does not load, or loads only with difficulty, in any browser or application (Facebook, other sites),...
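For reference on redirect behavior, the whole-bucket redirect form of an S3 website configuration (as passed to `aws s3api put-bucket-website`) has the shape below, with a placeholder hostname; note that S3 only serves these redirects through the bucket's *website* endpoint, not the REST endpoint:

```json
{
  "RedirectAllRequestsTo": {
    "HostName": "example.com",
    "Protocol": "https"
  }
}
```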
I have been trying to import data into SageMaker Canvas Data Wrangler from an S3 bucket. Both Canvas and S3 are in the same region, and I tried changing the CORS settings as mentioned in the post...
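For reference, an S3 CORS configuration has the general shape below; the values shown are illustrative, not the exact rule from the post referenced above:

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "POST", "PUT"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": ["ETag"]
  }
]
```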
I have a requirement to transfer a large volume of S3 data (10 TB) one time from one AWS account to another, plus 1 TB of data on demand (once or twice a month). This...
Hi AWS, we have a setup in an existing AWS Account comprising S3 buckets, DynamoDB tables, and EFS. These are all running actively. They need to be migrated to a new AWS Account.
One of the approaches...
I'm generating pre-signed URLs using a Lambda function. The application then uses the URLs to upload files to an S3 bucket.
Problem: I'm trying to upload files to recently created buckets. In...
> Resource handler returned message:
"Invalid request provided: DataSync location access test failed: could not perform S3:ListObjectsV2 on bucket *my_bucket* Access denied. Ensure bucket access role...
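For what it's worth, the `ListObjectsV2` API call is authorized by the `s3:ListBucket` action on the bucket ARN itself (not on `bucket/*`). A sketch of the bucket-access role policy DataSync typically needs, with a placeholder bucket name:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DataSyncBucketAccess",
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketLocation",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads"
      ],
      "Resource": "arn:aws:s3:::example-bucket"
    },
    {
      "Sid": "DataSyncObjectAccess",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:AbortMultipartUpload",
        "s3:ListMultipartUploadParts"
      ],
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
```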