AWS S3 upload using Python


I'm trying to learn how to upload data to AWS S3 using Python. I've partially coded a script that uploads files and directories, but I was wondering if there is any way I could have someone check it out and tell me what is good or bad about the script. It works, but I want to do more with it. Is that possible in this forum? I'm not sure I want to post the script here. Maybe I could email it or send it in a message to some experienced AWS Python developers?

  • Maybe I should just ask my questions here. For example, I see that TransferConfig has a max_concurrency option, which takes the number of threads to use. Does this mean we don't need to create our own thread pool? Does AWS do that for us? (See the first sketch after this list.)

  • If it's true that we don't have to create a pool, how do we send the logs from the various threads to CloudWatch, or save them somewhere? (The second sketch after this list shows one way to collect them.)

  • I'd love to be able to get a message to AWS about their documentation. It's pretty bad. I don't expect examples matching exactly what I want to do, but I have decades of experience in Unix, mainframe, and Windows development, I've worked in over 20 computer languages, and this documentation is still pretty poor. The more I read, the more questions I have, and I have no way of getting them answered except here. There is no training that will answer them either, as far as I know. It's as if AWS is saying that they shouldn't be the source of documentation; the internet should be.

  • If AWS is listening, I'll give an example of their poor documentation. Here's a link: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/boto3.html On this page, it describes boto3.set_stream_logger, which has a level parameter. Nowhere does it say what the options are for that parameter, and this happens throughout their documentation. Like I said, I have a lot of experience and have checked out documentation from many different companies. This is pretty bad documentation. (The second sketch after this list shows the accepted values.)
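
On the max_concurrency question: as far as I can tell, the boto3 transfer manager creates and manages its own worker threads when use_threads is enabled, so you don't build a pool yourself. A minimal sketch, with placeholder bucket and file names:

```python
import boto3
from boto3.s3.transfer import TransferConfig

# The transfer manager spins up its own worker threads when
# use_threads=True; max_concurrency caps how many run at once.
config = TransferConfig(
    multipart_threshold=8 * 1024 * 1024,  # switch to multipart above 8 MB
    max_concurrency=10,                   # size of boto3's internal thread pool
    use_threads=True,
)

s3 = boto3.client("s3")
# Placeholder bucket/key/path -- substitute your own.
s3.upload_file("big_file.dat", "my-bucket", "uploads/big_file.dat", Config=config)
```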
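
On the level parameter: it takes the standard library logging levels (logging.DEBUG, logging.INFO, logging.WARNING, logging.ERROR, logging.CRITICAL) and defaults to logging.DEBUG. And because Python's logging module is thread-safe, a single handler will collect records from all of the transfer worker threads; shipping that log file on to CloudWatch would then be a job for the CloudWatch agent or a third-party handler such as watchtower (both assumptions on my part, not something boto3 does for you). A sketch, with a placeholder log file name:

```python
import logging
import boto3

# level accepts the stdlib logging constants: DEBUG, INFO,
# WARNING, ERROR, CRITICAL (DEBUG is the default).
boto3.set_stream_logger("boto3", level=logging.INFO)

# Python's logging is thread-safe, so one shared handler collects
# records emitted by all of the transfer manager's worker threads.
handler = logging.FileHandler("s3_upload.log")  # placeholder file name
handler.setFormatter(
    logging.Formatter("%(asctime)s %(threadName)s %(levelname)s %(message)s")
)
for name in ("boto3", "botocore", "s3transfer"):
    logging.getLogger(name).addHandler(handler)
    logging.getLogger(name).setLevel(logging.INFO)
```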

Asked 2 years ago · 392 views
4 answers
Accepted Answer

My understanding is that the threading means you can upload multiple files, or multiple parts of one file, at the same time to improve efficiency, if the source system is capable of running in that manner.

For CloudWatch, there will only be entries if you have enabled CloudTrail to log to CloudWatch. This would monitor management-level and data-level actions, and from what I've seen it wouldn't matter whether the upload is multi-threaded or multi-part.
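
For example, a sketch of enabling object-level (data) events for one bucket on an existing trail using boto3; the trail and bucket names below are placeholders:

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Turn on object-level (data) events for one bucket on an existing
# trail so S3 uploads show up; names here are placeholders.
cloudtrail.put_event_selectors(
    TrailName="my-trail",
    EventSelectors=[
        {
            "ReadWriteType": "All",
            "IncludeManagementEvents": True,
            "DataResources": [
                {"Type": "AWS::S3::Object", "Values": ["arn:aws:s3:::my-bucket/"]}
            ],
        }
    ],
)
```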

Answered 2 years ago
AWS EXPERT · Answered 2 years ago

I work with PHP more than Python, but AWS provides very useful SDKs for the majority of their services: https://aws.amazon.com/developer/language/python/

Answered 2 years ago

Here is an S3 upload example from the docs. Here is a working example from one of the POCs. You can check these against what you have to see how it's done. Hope this helps!
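
For quick reference, a minimal sketch along the lines of the documented upload example, with placeholder file and bucket names:

```python
import logging
import boto3
from botocore.exceptions import ClientError

def upload_file(file_name, bucket, object_name=None):
    """Upload a file to an S3 bucket; return True on success."""
    # Default the object key to the local file name.
    if object_name is None:
        object_name = file_name
    s3 = boto3.client("s3")
    try:
        s3.upload_file(file_name, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True

# Placeholder names -- substitute your own file and bucket.
upload_file("example.txt", "my-bucket", "uploads/example.txt")
```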

AWS
Kunal_G
Answered 2 years ago
