Error message "Please reduce your request rate" from a script used for the Amazon Robotics HWTE database in Athena


I have the following script, used in an Athena query for an AR product:

-- Look for drive units containing these motor assemblies.
SELECT pt_floor, du_id, subsystem, serial_num, part_num, rev_num, mfg_date,
       max(file_datetime) AS most_recent
FROM "croklogs"."cklg_var_header_1_eu"
WHERE serial_num IN ('10N210308426', '10N203409223', '10N212444598',
                     '10N212448969', '10N212446784', '10N212453822')
  -- Pick a day from before the part was returned
  AND pt_date >= date '2022-01-01'
  AND pt_date < date '2024-02-20'
GROUP BY pt_floor, du_id, subsystem, serial_num, part_num, rev_num, mfg_date

I received the following error message:

HIVE_CURSOR_ERROR: com.amazonaws.services.s3.model.AmazonS3Exception: Please reduce your request rate. (Service: Amazon S3; Status Code: 503; Error Code: SlowDown; Request ID: C0FSYKHMTDG6A10K; S3 Extended Request ID: /Bekz/IDaFumL+ah0Cde4mRuN4QYKAfviCn+FotPSkwnAqW0FAVy94mFGTA4znwOyU2aFN2s0t8=; Proxy: null), S3 Extended Request ID: /Bekz/IDaFumL+ah0Cde4mRuN4QYKAfviCn+FotPSkwnAqW0FAVy94mFGTA4znwOyU2aFN2s0t8= This query ran against the "amazon_robotics_hwte" database, unless qualified by the query. Please post the error message on our forum or contact customer support with Query Id: fcd3bc91-af69-43d3-80a7-6657f0545053

Jason
asked 2 months ago · 125 views
1 Answer

Greetings,

Thank you for reaching out to AWS via this post.

From your post I understand that you are receiving the error below when running a query in Athena:

---------------------------
HIVE_CURSOR_ERROR: com.amazonaws.services.s3.model.AmazonS3Exception: Please reduce your request rate.
---------------------------

Please correct me if I have misunderstood anything.

This error indicates that you are hitting the maximum request rate limit for Amazon Simple Storage Service (Amazon S3). It typically occurs when a query has to read a large number of objects stored under the same S3 prefix. Amazon S3 can handle 3,500 PUT/COPY/POST/DELETE or 5,500 GET/HEAD requests per second per partitioned prefix in a bucket, as explained in resource [1].
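If you want to gauge how many objects your query actually reads, Athena exposes a "$path" pseudo-column containing the S3 location of each source file. A sketch against the table from your question (the date range is copied from the original query):

```sql
-- Diagnostic sketch: count the distinct S3 objects this scan touches.
-- Many small objects under one prefix makes SlowDown errors more likely.
SELECT count(DISTINCT "$path") AS objects_read
FROM "croklogs"."cklg_var_header_1_eu"
WHERE pt_date >= date '2022-01-01'
  AND pt_date < date '2024-02-20';
```

A large count here relative to the data volume suggests many small files per prefix, which is exactly the pattern that triggers this error.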

Amazon S3 scales automatically to support very high request rates by default; however, if the request threshold is exceeded, you will receive 5xx errors asking you to slow down or try again later. Please note that this limit is shared by all users and services in the account that access the same prefix.

For potential workarounds to alleviate the error, please refer to resources [2] and [3]. Additionally, resource [4] covers partitioning data in Athena, a technique that also helps with this error, and resource [5] lists the Top 10 Performance Tuning Tips for Amazon Athena.
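To illustrate the partitioning approach from resource [4], the table could be rewritten as a partitioned copy with an Athena CTAS statement, so that date filters prune whole S3 prefixes instead of listing every object. This is only a sketch: the new table name and the target bucket location below are hypothetical, and the best partition column depends on your data layout.

```sql
-- Sketch (hypothetical names): create a copy of the table partitioned
-- by pt_date. Partition columns must come last in the SELECT list.
CREATE TABLE cklg_var_header_1_eu_by_date
WITH (
    format = 'PARQUET',
    external_location = 's3://your-bucket/cklg_var_header_by_date/',
    partitioned_by = ARRAY['pt_date']
) AS
SELECT pt_floor, du_id, subsystem, serial_num, part_num,
       rev_num, mfg_date, file_datetime, pt_date
FROM "croklogs"."cklg_var_header_1_eu";
```

Queries that filter on pt_date against the partitioned copy only read the matching S3 prefixes, spreading requests across many prefixes and reducing the chance of SlowDown responses.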

I hope you found the information provided helpful.

Wishing you a great day further!

Resources:

[1] Best practices design patterns: optimizing Amazon S3 performance - https://docs.aws.amazon.com/AmazonS3/latest/userguide/optimizing-performance.html

[2] How do I troubleshoot an HTTP 500 or 503 error from Amazon S3? - https://repost.aws/knowledge-center/http-5xx-errors-s3

[3] Organizing objects using prefixes - https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-prefixes.html

[4] Partitioning data in Athena - https://docs.aws.amazon.com/athena/latest/ug/partitions.html

[5] Top 10 Performance Tuning Tips for Amazon Athena - https://aws.amazon.com/blogs/big-data/top-10-performance-tuning-tips-for-amazon-athena/

AWS
answered 2 months ago
EXPERT
reviewed 25 days ago
