Questions tagged with Amazon Simple Storage Service


How can I fetch the S3 bucket name associated with a given RDS DB identifier using Java code?
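There is no RDS API that directly returns "the" S3 bucket for a DB instance; the usual link between the two is a snapshot export task. Assuming that is the association in question, a minimal sketch in Python with boto3 (the same `DescribeExportTasks` call exists in the AWS SDK for Java) would filter export tasks by the DB identifier in their source ARN:

```
def buckets_for_db(export_tasks, db_identifier):
    """Filter export-task records down to the S3 buckets whose
    SourceArn mentions the given DB identifier."""
    return sorted({
        t["S3Bucket"]
        for t in export_tasks
        if db_identifier in t.get("SourceArn", "")
    })

def fetch_buckets(db_identifier, region="us-east-1"):
    """Call RDS DescribeExportTasks and extract matching buckets.
    Requires credentials allowed to call rds:DescribeExportTasks."""
    import boto3  # AWS SDK for Python
    rds = boto3.client("rds", region_name=region)
    tasks = rds.describe_export_tasks()["ExportTasks"]
    return buckets_for_db(tasks, db_identifier)
```

If the bucket association comes from somewhere else (for example an application convention), this approach does not apply.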
1 answer · 0 votes · 28 views · asked 3 days ago
I converted a CSV (from S3) to Parquet (to S3) using AWS Glue, and the resulting Parquet file was named randomly. How do I choose the name of the output Parquet file? ![Enter image description here](/media/postImages/original/IMUQds6rTFS8i2Yv9YENybcQ) When I append data.parquet to the S3 target path without a trailing '/', AWS Glue creates a subfolder in the bucket named data.parquet instead of a file, while the new Parquet file is created with a name like "run-1678983665978-part-block-0-r-00000-snappy.parquet". Where should I specify a name for the Parquet file?
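Glue (Spark under the hood) generates its own part-file names, and the sink does not expose a way to set the final file name. A common workaround is to rename the output after the job with an S3 copy and delete. A sketch in Python with boto3, assuming a single-part output; the bucket, prefix, and target key are placeholders for your own values:

```
def pick_part_file(keys, suffix=".parquet"):
    """Return the single Glue-generated part file from a key listing
    (e.g. 'run-...-part-block-0-r-00000-snappy.parquet')."""
    parts = [k for k in keys if k.endswith(suffix) and "part-" in k]
    if len(parts) != 1:
        raise ValueError(f"expected exactly one part file, got {parts}")
    return parts[0]

def rename_glue_output(bucket, prefix, target_key):
    """Copy the part file to the desired key, then delete the original.
    Needs s3:ListBucket, s3:GetObject, s3:PutObject, s3:DeleteObject."""
    import boto3
    s3 = boto3.client("s3")
    listing = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    keys = [o["Key"] for o in listing.get("Contents", [])]
    src = pick_part_file(keys)
    s3.copy_object(Bucket=bucket, Key=target_key,
                   CopySource={"Bucket": bucket, "Key": src})
    s3.delete_object(Bucket=bucket, Key=src)
```

If the job can produce multiple part files, either repartition to one output in the job or loop over the parts instead of insisting on exactly one.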
1 answer · 0 votes · 29 views · asked 4 days ago
From the [customize-file-delivery-notifications-using-aws-transfer-family-managed-workflows](https://aws.amazon.com/blogs/storage/customize-file-delivery-notifications-using-aws-transfer-family-managed-workflows/) blog: "AWS Transfer Family is a secure transfer service that enables you to transfer files into and out of AWS storage services." ![Blog Post snapshot](/media/postImages/original/IMsiG3a3NbSmivTOY2hVBkiA) Does this mean Transfer Family supports transferring files from S3 to external servers outside of AWS? To give my use case for better understanding: I need to transfer large files (70-80 GB) to an external server using Akamai NetStorage.
1 answer · 0 votes · 26 views · asked 4 days ago
Hello, I'm hoping to get help with a problem I'm facing. I created a Spring Boot REST API for my app and deployed it with AWS Elastic Beanstalk. When I try to upload a file to my S3 bucket through the REST API, I get an error saying the request body is too large, even for image files smaller than 1 MB. How can I solve this? Here's the relevant part of the app's error log:

2023/03/21 05:12:26 [error] 2736#2736: *56 client intended to send too large body: 3527163 bytes, client: ..., server: , request: "POST /mobile/create_post HTTP/1.1", host: "..."
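That log line is nginx (the reverse proxy in front of the Elastic Beanstalk instance) rejecting the request: its default `client_max_body_size` is 1 MB, so the limit is hit before the request ever reaches Spring or S3. One common fix, assuming an Amazon Linux 2 platform, is to ship an nginx override in the source bundle at `.platform/nginx/conf.d/client_max_body_size.conf` (the 50M value here is an assumption; size it to your largest expected upload):

```
client_max_body_size 50M;
```

Spring's own `spring.servlet.multipart.max-file-size` and `max-request-size` limits may also need raising to match.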
1 answer · 0 votes · 28 views · asked 5 days ago
I'm communicating with an MCU via SSE and need to do this over HTTP (the MCU doesn't support SSL). My React code is hosted as a static site on S3. How can I set this preflight header in the S3 bucket's CORS policy? Please provide a working example if this is possible.
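For what the S3 side can do: a bucket's CORS configuration is a JSON document set under Permissions > Cross-origin resource sharing (CORS). A minimal sketch (the origin is a placeholder) that answers preflights for GET requests from a given origin:

```
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "HEAD"],
    "AllowedOrigins": ["http://example-origin.local"],
    "ExposeHeaders": [],
    "MaxAgeSeconds": 3000
  }
]
```

Note that S3 only answers preflights for requests made *to* the bucket; preflight headers for requests your page makes to the MCU have to come from the MCU's own HTTP server. Also, a page served over HTTPS cannot make plain-HTTP requests (browsers block them as mixed content), though the S3 website endpoint itself is HTTP-only, which avoids that problem.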
0 answers · 0 votes · 7 views · asked 5 days ago
I am trying to get a notification each time an S3 object is opened (read), and I'm wondering whether this is possible using S3 notifications. We are using a barcode scanner to open PDFs stored as S3 objects. I found a tutorial somewhere on how to do this: "For the "Events" section, select the "All object read operations" or choose "Specific object read operations" and check "GetObject". This will trigger the event when an object is opened (read)." Here's where I'm looking for notifications: S3 bucket > Create event notification, but I cannot find a "Specific object read operations" trigger. So, is there a way to send a notification or trigger a Lambda function when an object is opened?
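S3 event notifications only cover write-type events (object created, removed, restored, and similar), which is why no read trigger appears in the console; the quoted tutorial does not match the actual console options. Object reads can instead be captured by enabling CloudTrail data events for the bucket and matching them with an EventBridge rule that targets a Lambda function or SNS topic. A sketch of such an event pattern (the bucket name is a placeholder), which requires a trail logging S3 data events:

```
{
  "source": ["aws.s3"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["s3.amazonaws.com"],
    "eventName": ["GetObject"],
    "requestParameters": {
      "bucketName": ["my-pdf-bucket"]
    }
  }
}
```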
1 answer · 0 votes · 30 views · srfsup · asked 5 days ago
In the last two days, my bill increased a lot due to a large number of S3 ListObjects API requests. I'm sure they didn't originate from my services. How can I figure out where these requests are coming from and which buckets were accessed? I had no clue about this issue, so I enabled CloudTrail, but I still couldn't find where the ListObjects calls came from.
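S3 server access logs record every request with the requester's account/ARN and source IP, which fits this kind of investigation (a newly enabled CloudTrail trail also only sees requests made after it was created, and List calls are data events that must be explicitly enabled). Once access logging is turned on and an Athena table is defined over the log prefix, a query along these lines surfaces who issued the List calls; the database and table names below are placeholders, and `REST.GET.BUCKET` is the access-log operation name for ListObjects:

```
SELECT requester, remoteip, useragent, COUNT(*) AS calls
FROM s3_access_logs_db.mybucket_logs
WHERE operation = 'REST.GET.BUCKET'
GROUP BY requester, remoteip, useragent
ORDER BY calls DESC;
```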
1 answer · 0 votes · 17 views · Mason · asked 5 days ago
Hello, I've been trying to upload a large 158 GB file to an S3 bucket, but I keep getting a "Network Error" and the connection breaks. Is there a reliable way to do this? And if so, is there documentation about it? Thanks in advance. Rajnesh
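Multipart upload is the reliable way for a file this size: a single PUT is capped at 5 GB, and the console is not built for 158 GB transfers, whereas multipart uploads parts independently and retries only failed parts. With boto3, `upload_file` does this automatically via a `TransferConfig`; a sketch (path, bucket, and key are placeholders), keeping in mind S3's 10,000-part ceiling when picking a chunk size:

```
def part_count(file_size, chunk_size):
    """Number of multipart parts a chunk size produces
    (S3 allows at most 10,000 parts per upload)."""
    return -(-file_size // chunk_size)  # ceiling division

def upload_large_file(path, bucket, key, chunk_mb=64):
    import boto3
    from boto3.s3.transfer import TransferConfig
    cfg = TransferConfig(
        multipart_threshold=8 * 1024 * 1024,      # use multipart above 8 MB
        multipart_chunksize=chunk_mb * 1024 * 1024,
        max_concurrency=8,                        # parallel part uploads
        use_threads=True,
    )
    boto3.client("s3").upload_file(path, bucket, key, Config=cfg)
```

The AWS CLI (`aws s3 cp`) performs the same multipart upload automatically and is a good alternative if no code is involved.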
3 answers · 0 votes · 26 views · asked 6 days ago
Is there any way to automatically move all WorkMail email attachments to an S3 bucket and, of course, get their links? If yes, how can I do that? Thanks. Regards
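WorkMail can invoke a Lambda function through an email flow rule, and inside that Lambda the `workmailmessageflow` API returns the raw MIME message, from which attachments can be extracted and written to S3. A sketch (the bucket name is a placeholder, and presigned URLs stand in for "their link"):

```
import email
from email import policy

def extract_attachments(raw_bytes):
    """Return (filename, payload) pairs for each attachment in a raw MIME message."""
    msg = email.message_from_bytes(raw_bytes, policy=policy.default)
    return [(part.get_filename(), part.get_content())
            for part in msg.iter_attachments()]

def handler(event, context):
    """Lambda handler for a WorkMail email flow rule."""
    import boto3
    msg_id = event["messageId"]
    flow = boto3.client("workmailmessageflow")
    raw = flow.get_raw_message_content(messageId=msg_id)["messageContent"].read()
    s3 = boto3.client("s3")
    links = []
    for name, payload in extract_attachments(raw):
        key = f"attachments/{msg_id}/{name}"
        body = payload if isinstance(payload, bytes) else payload.encode()
        s3.put_object(Bucket="my-attachment-bucket", Key=key, Body=body)  # placeholder bucket
        links.append(s3.generate_presigned_url(
            "get_object", Params={"Bucket": "my-attachment-bucket", "Key": key}))
    return {"links": links}
```

Note that presigned URLs expire (one hour by default), so for permanent links the objects would need to be public or served through CloudFront instead.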
1 answer · 0 votes · 22 views · asked 7 days ago
Hi AWS, I am trying to make the S3 `BucketEncryption` property conditional on whether the bucket should use a customer managed key (SSE-KMS) or an AWS managed key (SSE-S3). The code for the template is:

```
# version: 1.0
AWSTemplateFormatVersion: "2010-09-09"
Description: Create standardized S3 bucket using CloudFormation Template
Parameters:
  BucketName:
    Type: String
    Description: "Name of the S3 bucket"
  KMSKeyArn:
    Type: String
    Description: "KMS Key Arn to encrypt S3 bucket"
    Default: ""
  SSEAlgorithm:
    Type: String
    Description: "Encryption algorithm for KMS"
    AllowedValues:
      - aws:kms
      - AES256
Conditions:
  KMSKeysProvided: !Not [!Equals [!Ref KMSKeyArn, ""]]
Resources:
  S3Bucket:
    Type: 'AWS::S3::Bucket'
    DeletionPolicy: Retain
    UpdateReplacePolicy: Retain
    Properties:
      BucketName: !Ref BucketName
      PublicAccessBlockConfiguration:
        BlockPublicAcls: true
        BlockPublicPolicy: true
        IgnorePublicAcls: true
        RestrictPublicBuckets: true
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - !If
            - KMSKeysProvided
            - ServerSideEncryptionByDefault:
                SSEAlgorithm: !Ref SSEAlgorithm
                KMSMasterKeyID: !Ref KMSKeyArn
              BucketKeyEnabled: true
            - !Ref "AWS::NoValue"
```

When I select `AES256` as the SSEAlgorithm, I receive the error **Property ServerSideEncryptionConfiguration cannot be empty**. I know `KMSMasterKeyID` must not be present when the SSEAlgorithm is `AES256`, but I am confused about how to get rid of this error. Please help.
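The error occurs because when `KMSKeysProvided` is false, the `!If` resolves the only list item to `AWS::NoValue`, leaving `ServerSideEncryptionConfiguration` as an empty list. One way around it (a sketch of just the changed property) is to always emit one encryption rule and apply the condition only to the KMS-specific fields, so the AES256 case still gets a valid rule:

```
BucketEncryption:
  ServerSideEncryptionConfiguration:
    - ServerSideEncryptionByDefault:
        SSEAlgorithm: !Ref SSEAlgorithm
        KMSMasterKeyID: !If [KMSKeysProvided, !Ref KMSKeyArn, !Ref "AWS::NoValue"]
      BucketKeyEnabled: !If [KMSKeysProvided, true, !Ref "AWS::NoValue"]
```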
2 answers · 0 votes · 32 views · asked 7 days ago
I set up AdministratorAccess for my role, a top-level policy that allows the role to use all services, in particular AWS Glue. I want to create a crawler to build an ETL pipeline and load data into a database in the AWS Glue Data Catalog, but I am stuck on a 400 access denied error. I tried many things, like:
- Changing the credit card and setting it as the default
- Adding permissions many times

but it still failed.
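One common cause worth checking: a crawler runs under its own IAM service role, so `AdministratorAccess` on the console user does not carry over. That role must trust `glue.amazonaws.com` and hold Glue permissions (for example the managed `AWSGlueServiceRole` policy) plus read access to the source S3 path; billing details are unrelated to a 400 access denied error. A sketch of the trust policy such a service role needs:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "glue.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```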
1 answer · 0 votes · 23 views · asked 8 days ago
Hi, I'm working on a Django file upload project and this issue keeps popping up:

> An error occurred (SignatureDoesNotMatch) when calling the CreateMultipartUpload operation: The request signature we calculated does not match the signature you provided. Check your key and signing method.

This is my settings.py:

```
AWS_STORAGE_BUCKET_NAME = 'bucket-name'
AWS_S3_REGION_NAME = 'region-name'
```

My .env:

```
AWS_ACCESS_KEY_ID='Access_key_ID'
AWS_SECRET_ACCESS_KEY='Secret_Key_ID'
```

A detailed step-by-step procedure to solve this issue would be very helpful, as I'm a complete beginner with AWS (I started this project two days ago). Thank you
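Assuming django-storages here (`AWS_STORAGE_BUCKET_NAME` is one of its settings), `SignatureDoesNotMatch` is usually either a region/signature-version mismatch or the quotes in the `.env` file leaking into the credential values (depending on how the file is parsed, `'Access_key_ID'` can be read quotes and all, which corrupts the signature). A sketch of the settings to pin down; the bucket and region values are placeholders:

```
# settings.py fragment -- values shown are placeholders
AWS_STORAGE_BUCKET_NAME = "my-bucket"
AWS_S3_REGION_NAME = "eu-west-1"     # must match the bucket's actual region
AWS_S3_SIGNATURE_VERSION = "s3v4"    # newer regions accept only Signature Version 4
```

In `.env`, also try dropping the quotes entirely (`AWS_ACCESS_KEY_ID=AKIA...`) and confirm the secret key was copied with no trailing whitespace.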
1 answer · 0 votes · 23 views · asked 8 days ago