Data Wrangler: Data Flow: Export to S3 using Jupyter Notebook


I created a data flow using Data Wrangler and am trying to export it to S3 using a Jupyter Notebook. Every time I run the notebook, I get the error below when it creates the processing job:

Error: An error occurred (ResourceLimitExceeded) when calling the CreateProcessingJob operation: The account-level service limit 'ml.m5.4xlarge for processing job usage' is 0 Instances, with current utilization of 0 Instances and a request delta of 2 Instances. Please contact AWS support to request an increase for this limit.

Please provide a solution for this. I have already increased the service quotas for running apps and notebook instances, but the same issue still arises.

Asked 2 years ago · 324 views
1 Answer

When you export a data flow to S3, Data Wrangler starts a SageMaker processing job that runs the generated script and writes the output data to S3.

So you need to increase the account-level limit for processing job instances of type ml.m5.4xlarge (not the quotas for running apps or notebook instances, which are separate limits).
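If you prefer to do this programmatically rather than through the Service Quotas console or an AWS Support case, here is a minimal sketch using the Service Quotas API from boto3. It assumes the SageMaker quota name matches the string shown in your error message ("ml.m5.4xlarge for processing job usage") and requests a value of 2, since the job asked for a delta of 2 instances; verify the quota name and desired value for your account before running it.

```python
# Sketch: find the SageMaker processing-job quota for ml.m5.4xlarge and
# request an increase. The quota-name match is based on the error message
# and may need adjusting for your account/region.
import boto3

sq = boto3.client("service-quotas")

# Page through all SageMaker quotas and find the processing-job one.
target = None
paginator = sq.get_paginator("list_service_quotas")
for page in paginator.paginate(ServiceCode="sagemaker"):
    for quota in page["Quotas"]:
        if "ml.m5.4xlarge for processing job" in quota["QuotaName"]:
            target = quota
            break
    if target:
        break

if target:
    print(f"Current limit for '{target['QuotaName']}': {target['Value']}")
    # The Data Wrangler job requested 2 instances, so ask for at least 2.
    response = sq.request_service_quota_increase(
        ServiceCode="sagemaker",
        QuotaCode=target["QuotaCode"],
        DesiredValue=2.0,
    )
    print("Increase request status:", response["RequestedQuota"]["Status"])
else:
    print("Quota not found; request the increase via the Service Quotas console.")
```

The request is reviewed by AWS and may take some time to be approved; once the new limit is active, re-run the exported notebook and the CreateProcessingJob call should succeed.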

AWS
Durga_S
Answered 2 years ago
AWS
EXPERT
Alex_T
Reviewed 2 years ago
