I've done exactly what you are trying to do, and the issue is that your parts aren't large enough. Specify a larger part size, e.g. `multipart_chunksize=1024 * 1024 * 250`. The chunk size is in bytes, and you are limited to 10,000 parts per upload, so that's what is going on here.
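As a minimal sketch (bucket and file names are placeholders), you can pass the larger part size through a `TransferConfig` when calling `upload_file`. With boto3's default 8 MiB chunk size, 10,000 parts caps you at roughly 80 GiB, so larger files need a larger `multipart_chunksize`:

```python
import boto3
from boto3.s3.transfer import TransferConfig

# 250 MiB parts: 10,000 parts x 250 MiB allows objects up to ~2.4 TiB.
config = TransferConfig(
    multipart_threshold=1024 * 1024 * 100,  # switch to multipart above ~100 MiB
    multipart_chunksize=1024 * 1024 * 250,  # part size in bytes
)

s3 = boto3.client("s3")
# "my-bucket" and "large-file.bin" are placeholder names.
s3.upload_file("large-file.bin", "my-bucket", "large-file.bin", Config=config)
```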
Since you are using multipart uploads, I want to be sure you know about this: I recently found out that failed (incomplete) multipart uploads are kept in S3 and you pay for the stored parts. If this is happening, your storage metrics will show more objects and more storage than you can actually see in your bucket. I would ask AWS for a refund of those costs if you have them. Here's an article on how to create a lifecycle policy that automatically deletes these orphaned parts: https://aws.amazon.com/blogs/aws/s3-lifecycle-management-update-support-for-multipart-uploads-and-delete-markers/
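If you'd rather set the rule programmatically than through the console, here's a minimal sketch with boto3 (the bucket name and the 7-day window are assumptions; adjust them to your needs):

```python
import boto3

s3 = boto3.client("s3")

# Abort multipart uploads that haven't completed within 7 days
# and delete their stored parts. "my-bucket" is a placeholder.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "abort-incomplete-multipart-uploads",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
            }
        ]
    },
)
```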
This was the issue. Thank you!