Network error when uploading a large (158 GB) file to an S3 bucket


Hello, I've been trying to upload a large 158 GB file to an S3 bucket, but I keep getting a "Network Error" and the connection breaks. Is there a reliable way to do this? And if so, is there documentation about it? Thanks in advance. Rajnesh

Asked 1 year ago · 1,852 views
3 Answers

Hi Rajnesh,

Unless you have already tried them, there are a couple of approaches:

- Use a tool that performs a multipart upload, such as the AWS CLI (`aws s3 cp` splits large files into parts and retries failed parts automatically). A single PUT request is limited to 5 GB, so a 158 GB object has to be uploaded in parts in any case.
- If the network path to the bucket's region is long or unreliable, enabling S3 Transfer Acceleration on the bucket can also help.

Hope it helps ;)
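If you go the multipart route, it can be worth sanity-checking the part-size arithmetic against S3's limits (at most 10,000 parts per upload, and every part except the last must be at least 5 MiB). A rough sketch in plain Python; the 64 MiB target is just an illustrative default, not a recommendation from this thread:

```python
import math

MiB = 1024 ** 2
GiB = 1024 ** 3

S3_MAX_PARTS = 10_000        # S3 allows at most 10,000 parts per upload
S3_MIN_PART_SIZE = 5 * MiB   # every part except the last must be >= 5 MiB

def choose_part_size(file_size, target=64 * MiB):
    """Pick a part size that keeps the upload within the 10,000-part limit."""
    smallest_allowed = math.ceil(file_size / S3_MAX_PARTS)
    return max(S3_MIN_PART_SIZE, target, smallest_allowed)

file_size = 158 * GiB
part_size = choose_part_size(file_size)
num_parts = math.ceil(file_size / part_size)  # 2,528 parts of 64 MiB
```

Smaller parts mean less data to re-send when one part fails mid-transfer, at the cost of more requests; for a 158 GB file, anything from roughly 17 MiB up to 5 GiB per part stays within the limits.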

Expert
Answered 1 year ago

You didn't say how you are uploading the file. I wrote a Python application that uploads backups from my backup software, which are also large; I'm uploading a 90 GB file right now. If that's how you're doing it, I can give you more pointers.

There's another pointer that might help regardless of which method you use: increase the number of retries by changing your config file. Here's the documentation for that: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/retries.html. Let me know if you need more information on the Python method.
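As the linked retry guide describes, the retry mode and attempt count can be set in the AWS config file (`~/.aws/config`); a minimal fragment, with the values here purely illustrative:

```ini
[default]
retry_mode = standard
max_attempts = 10
```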

Answered 1 year ago

Also, since your large uploads are failing, you may have failed multipart uploads that AWS still charges for. Below are two articles on creating a lifecycle rule that deletes these incomplete uploads. I would set the time period to one day at first, so you don't have to wait long for them to be deleted. You can't see these parts in your bucket, but you can see them using S3 Storage Lens. You may want to open a support ticket with AWS if you've been paying for them; be prepared to tell support which buckets are affected and how far back to go.

https://aws.amazon.com/blogs/aws-cloud-financial-management/discovering-and-deleting-incomplete-multipart-uploads-to-lower-amazon-s3-costs/

https://aws.amazon.com/blogs/aws/s3-lifecycle-management-update-support-for-multipart-uploads-and-delete-markers/
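The rule described in those articles can also be applied programmatically. Below is a sketch of the rule body in the shape that boto3's `put_bucket_lifecycle_configuration` expects; the rule ID and the one-day window are illustrative choices, not values from the articles:

```python
# Sketch of a lifecycle rule that aborts incomplete multipart uploads
# after one day. Apply it with boto3, e.g.:
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="your-bucket", LifecycleConfiguration=lifecycle_config)
lifecycle_config = {
    "Rules": [
        {
            "ID": "abort-incomplete-multipart-uploads",  # illustrative name
            "Status": "Enabled",
            "Filter": {},  # empty filter = apply to the whole bucket
            "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 1},
        }
    ]
}
```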

Answered 1 year ago
