Data Wrangler: Data Flow: Export to S3 using Jupyter Notebook


I created a data flow in Data Wrangler, and when I try to export it to S3 using the Jupyter notebook, I get the error below every time the notebook creates a processing job:

Error: An error occurred (ResourceLimitExceeded) when calling the CreateProcessingJob operation: The account-level service limit 'ml.m5.4xlarge for processing job usage' is 0 Instances, with current utilization of 0 Instances and a request delta of 2 Instances. Please contact AWS support to request an increase for this limit.

Please provide a solution. I have already increased the service quotas for running apps and notebook instances, but the same issue still occurs.

1 Answer

When you export a data flow to S3, you're starting a SageMaker processing job that runs the generated script and stores the output data in S3.
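That processing job is what the error refers to: the exported notebook asks SageMaker for compute via a cluster configuration. As a sketch (the helper name is mine; the defaults mirror the instance type and the request delta of 2 instances from the error message), the relevant block of a CreateProcessingJob request looks like this:

```python
def processing_cluster_config(instance_type="ml.m5.4xlarge",
                              instance_count=2,
                              volume_gb=30):
    """Build the ClusterConfig block of a CreateProcessingJob request.

    Defaults are assumptions matching the error message: two
    ml.m5.4xlarge instances, which is what the exported notebook
    requested and what the account quota (0) could not cover.
    """
    return {
        "ClusterConfig": {
            "InstanceType": instance_type,
            "InstanceCount": instance_count,
            "VolumeSizeInGB": volume_gb,
        }
    }
```

Because the job requests 2 instances of ml.m5.4xlarge while the account-level quota for that instance type in processing jobs is 0, the CreateProcessingJob call fails with ResourceLimitExceeded.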

So, increase your account-level limit for processing job usage for the instance type ml.m5.4xlarge (rather than the limits for running apps or notebook instances).
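You can file that increase from the Service Quotas console, or programmatically via the Service Quotas API. A minimal sketch with boto3 (the helper names are mine; the quota-name substring is taken from the error message, and the desired value of 2 matches the request delta it reported — verify both in your account before running):

```python
def find_quota(quotas, name_fragment):
    """Return the first quota dict whose QuotaName contains name_fragment, else None."""
    for quota in quotas:
        if name_fragment in quota.get("QuotaName", ""):
            return quota
    return None


def request_processing_quota_increase(desired_value=2.0):
    """Look up the quota code for the processing-job limit and file an increase.

    Requires AWS credentials with service-quotas permissions.
    """
    import boto3  # imported here so find_quota above stays dependency-free
    client = boto3.client("service-quotas")

    # Collect all SageMaker quotas (the API is paginated)
    quotas = []
    for page in client.get_paginator("list_service_quotas").paginate(
            ServiceCode="sagemaker"):
        quotas.extend(page["Quotas"])

    # Match the quota named in the error message
    quota = find_quota(quotas, "ml.m5.4xlarge for processing job usage")
    if quota is None:
        raise RuntimeError("Quota not found; check the exact name in the "
                           "Service Quotas console")

    return client.request_service_quota_increase(
        ServiceCode="sagemaker",
        QuotaCode=quota["QuotaCode"],
        DesiredValue=desired_value,  # the error reported a delta of 2 instances
    )
```

Once the increase is approved, re-run the exported notebook and the processing job should start.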

AWS
Durga_S
answered 2 years ago
AWS
EXPERT
Alex_T
reviewed 2 years ago
