Network error when uploading a large (158 GB) file to an S3 bucket


Hello, I've been trying to upload a large 158 GB file to an S3 bucket, but I keep getting a "Network Error" and the connection breaks. Is there a reliable way to do this? And if so, is there documentation about it? Thanks in advance. Rajnesh

Asked a year ago · 1,855 views
3 Answers

Hi Rajnesh,

Unless you have already tried them, there are a couple of approaches:

- Use multipart upload. A single PUT to S3 is capped at 5 GB, so a 158 GB object has to go up in parts anyway, and AWS recommends multipart for anything over about 100 MB; a part that fails can be retried without restarting the whole transfer. Docs: https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html
- Use the AWS CLI (aws s3 cp) or an SDK rather than the console; their high-level transfer commands split large files into parts and retry failed parts automatically (see the sketch after this list).
- If you are uploading over a long-distance or unstable link, S3 Transfer Acceleration can help: https://docs.aws.amazon.com/AmazonS3/latest/userguide/transfer-acceleration.html
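If you go the SDK route, here is a minimal sketch of a managed multipart upload with boto3. The bucket name, object key, file path, and tuning values are placeholders to adapt, not anything from the original question:

```python
# Minimal sketch: multipart upload via boto3's managed transfer.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Files larger than multipart_threshold are split into multipart_chunksize
# parts; a failed part is retried on its own instead of restarting the upload.
config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,  # switch to multipart above 100 MB
    multipart_chunksize=100 * 1024 * 1024,  # 100 MB parts
    max_concurrency=8,                      # parallel part uploads
    use_threads=True,
)

s3.upload_file(
    Filename="/backups/huge-file.bin",  # placeholder local path
    Bucket="my-bucket",                 # placeholder bucket name
    Key="backups/huge-file.bin",        # placeholder object key
    Config=config,
)
```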

Hope it helps ;)

EXPERT
Answered a year ago

You didn't say how you are uploading the file. I wrote a Python application that uploads backups from my backup software, and those are large too; I'm uploading a 90 GB file right now. If that's how you are doing this, I can give you more pointers.

There's another pointer that might help no matter which method you use: increasing the number of retries, which you can set in your config file. Here's the documentation for that: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/retries.html Let me know if you need more information on the Python method.
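For example, here is a short sketch of those retry settings in code; the same values can go in ~/.aws/config instead. The adaptive mode and attempt count are just reasonable starting points, not something from the original post:

```python
# Sketch: raise the retry count for a flaky connection.
# Equivalent ~/.aws/config entries:
#   [default]
#   retry_mode = adaptive
#   max_attempts = 10
import boto3
from botocore.config import Config

retry_config = Config(retries={"max_attempts": 10, "mode": "adaptive"})
s3 = boto3.client("s3", config=retry_config)
```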

Answered a year ago

Also, since you are uploading large files and they are failing, you may have incomplete multipart uploads that AWS charges you for. I have two articles on how to create a lifecycle rule that deletes these failed uploads; I would set the time period to one day at first, so you don't have to wait long for them to be deleted. You can't see them in your bucket, but you can see them using S3 Storage Lens. You may want to open a support ticket with AWS if you've been paying for these parts; be prepared to tell them which buckets and how far back in time to go.

https://aws.amazon.com/blogs/aws-cloud-financial-management/discovering-and-deleting-incomplete-multipart-uploads-to-lower-amazon-s3-costs/

https://aws.amazon.com/blogs/aws/s3-lifecycle-management-update-support-for-multipart-uploads-and-delete-markers/
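If you'd rather script this than click through the console, here is a rough boto3 sketch that lists any incomplete uploads and then adds the one-day abort rule described above (the bucket name is a placeholder):

```python
# Sketch: find incomplete multipart uploads, then abort them via lifecycle rule.
import boto3

s3 = boto3.client("s3")
bucket = "my-bucket"  # placeholder bucket name

# Incomplete uploads don't appear as objects, but this call lists them.
for upload in s3.list_multipart_uploads(Bucket=bucket).get("Uploads", []):
    print(upload["Key"], upload["Initiated"])

# Lifecycle rule: abort any multipart upload still incomplete after one day.
s3.put_bucket_lifecycle_configuration(
    Bucket=bucket,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "abort-incomplete-multipart-uploads",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # empty prefix = whole bucket
                "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 1},
            }
        ]
    },
)
```

Note that put_bucket_lifecycle_configuration replaces the bucket's existing lifecycle configuration, so merge in any rules you already have.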

Answered a year ago
