Loading data from an RDS PostgreSQL database to an S3 bucket, but getting an "Unable to execute HTTP request" error.


Hi all. I created a Glue job that extracts data from an RDS PostgreSQL database and loads it into an S3 bucket. I used a crawler to create the schema of the RDS PostgreSQL source database. But when I run the job, it keeps running for almost an hour and then fails with the following error. I created a role with AWSGluerole access attached, which I am using to run the Glue job. I have granted permissions in Lake Formation and also added an inline policy for access to the S3 bucket. Any solution will be highly appreciated. An image of the error is also attached.

22/10/31 13:55:48 ERROR GlueExceptionAnalysisListener: [Glue Exception Analysis] { "Event": "GlueExceptionAnalysisTaskFailed", "Timestamp": 1667224548950, "Failure Reason": "Unable to execute HTTP request: Connect to my-first-bucket-0.s3.ap-northeast-1.amazonaws.com:443 [my-first-bucket-0.s3.ap-northeast-1.amazonaws.com/52.219.196.98, my-first-bucket-0.s3.ap-northeast-1.amazonaws.com/52.219.8.63, my-first-bucket-0.s3.ap-northeast-1.amazonaws.com/52.219.4.75, my-first-bucket-0.s3.ap-northeast-1.amazonaws.com/52.219.16.187, my-first-bucket-0.s3.ap-northeast-1.amazonaws.com/52.219.1.27, my-first-bucket-0.s3.ap-northeast-1.amazonaws.com/52.219.152.58] failed: connect timed out", "Stack Trace": [ { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor", "Method Name": "handleRetryableException", "File Name": "AmazonHttpClient.java", "Line Number": 1207 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor", "Method Name": "executeHelper", "File Name": "AmazonHttpClient.java", "Line Number": 1153 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor", "Method Name": "doExecute", "File Name": "AmazonHttpClient.java", "Line Number": 802 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor", "Method Name": "executeWithTimer", "File Name": "AmazonHttpClient.java", "Line Number": 770 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor", "Method Name": "execute", "File Name": "AmazonHttpClient.java", "Line Number": 744 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor", "Method Name": "access$500", "File Name": "AmazonHttpClient.java", "Line Number": 704 }, { "Declaring Class": 
"com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl", "Method Name": "execute", "File Name": "AmazonHttpClient.java", "Line Number": 686 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient", "Method Name": "execute", "File Name": "AmazonHttpClient.java", "Line Number": 550 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.http.AmazonHttpClient", "Method Name": "execute", "File Name": "AmazonHttpClient.java", "Line Number": 530 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.AmazonS3Client", "Method Name": "invoke", "File Name": "AmazonS3Client.java", "Line Number": 5140 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.AmazonS3Client", "Method Name": "invoke", "File Name": "AmazonS3Client.java", "Line Number": 5086 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.AmazonS3Client", "Method Name": "access$300", "File Name": "AmazonS3Client.java", "Line Number": 394 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.AmazonS3Client$PutObjectStrategy", "Method Name": "invokeServiceCall", "File Name": "AmazonS3Client.java", "Line Number": 6032 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.AmazonS3Client", "Method Name": "uploadObject", "File Name": "AmazonS3Client.java", "Line Number": 1812 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.services.s3.AmazonS3Client", "Method Name": "putObject", "File Name": "AmazonS3Client.java", "Line Number": 1772 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3.lite.call.PutObjectCall", "Method Name": "performCall", "File Name": "PutObjectCall.java", "Line Number": 35 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3.lite.call.PutObjectCall", "Method Name": "performCall", "File Name": 
"PutObjectCall.java", "Line Number": 10 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3.lite.call.AbstractUploadingS3Call", "Method Name": "perform", "File Name": "AbstractUploadingS3Call.java", "Line Number": 87 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3.lite.executor.GlobalS3Executor", "Method Name": "execute", "File Name": "GlobalS3Executor.java", "Line Number": 114 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3.lite.AmazonS3LiteClient", "Method Name": "invoke", "File Name": "AmazonS3LiteClient.java", "Line Number": 191 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3.lite.AmazonS3LiteClient", "Method Name": "invoke", "File Name": "AmazonS3LiteClient.java", "Line Number": 186 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3.lite.AmazonS3LiteClient", "Method Name": "putObject", "File Name": "AmazonS3LiteClient.java", "Line Number": 107 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3n.Jets3tNativeFileSystemStore", "Method Name": "storeFile", "File Name": "Jets3tNativeFileSystemStore.java", "Line Number": 152 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3n.MultipartUploadOutputStream", "Method Name": "uploadSinglePart", "File Name": "MultipartUploadOutputStream.java", "Line Number": 198 }, { "Declaring Class": "com.amazon.ws.emr.hadoop.fs.s3n.MultipartUploadOutputStream", "Method Name": "close", "File Name": "MultipartUploadOutputStream.java", "Line Number": 427 }, { "Declaring Class": "org.apache.hadoop.fs.FSDataOutputStream$PositionCache", "Method Name": "close", "File Name": "FSDataOutputStream.java", "Line Number": 73 }, { "Declaring Class": "org.apache.hadoop.fs.FSDataOutputStream", "Method Name": "close", "File Name": "FSDataOutputStream.java", "Line Number": 102 }, { "Declaring Class": "com.fasterxml.jackson.dataformat.csv.impl.UTF8Writer", "Method Name": "close", "File Name": "UTF8Writer.java", "Line Number": 74 }, { "Declaring Class": "com.fasterxml.jackson.dataformat.csv.impl.CsvEncoder", 
"Method Name": "close", "File Name": "CsvEncoder.java", "Line Number": 989 }, { "Declaring Class": "com.fasterxml.jackson.dataformat.csv.CsvGenerator", "Method Name": "close", "File Name": "CsvGenerator.java", "Line Number": 479 }, { "Declaring Class": "com.amazonaws.services.glue.writers.JacksonWriter", "Method Name": "done", "File Name": "JacksonWriter.scala", "Line Number": 73 }, { "Declaring Class": "com.amazonaws.services.glue.hadoop.TapeOutputFormat$$anon$1", "Method Name": "close", "File Name": "TapeOutputFormat.scala", "Line Number": 217 }, { "Declaring Class": "com.amazonaws.services.glue.sinks.HadoopWriters$", "Method Name": "$anonfun$writeNotPartitioned$3", "File Name": "HadoopWriters.scala", "Line Number": 125 }, { "Declaring Class": "org.apache.spark.util.Utils$", "Method Name": "tryWithSafeFinallyAndFailureCallbacks", "File Name": "Utils.scala", "Line Number": 1495 }, { "Declaring Class": "org.apache.spark.sql.glue.SparkUtility$", "Method Name": "tryWithSafeFinallyAndFailureCallbacks", "File Name": "SparkUtility.scala", "Line Number": 39 }, { "Declaring Class": "com.amazonaws.services.glue.sinks.HadoopWriters$", "Method Name": "writeNotPartitioned", "File Name": "HadoopWriters.scala", "Line Number": 125 }, { "Declaring Class": "com.amazonaws.services.glue.sinks.HadoopWriters$", "Method Name": "$anonfun$doStreamWrite$1", "File Name": "HadoopWriters.scala", "Line Number": 138 }, { "Declaring Class": "com.amazonaws.services.glue.sinks.HadoopWriters$", "Method Name": "$anonfun$doStreamWrite$1$adapted", "File Name": "HadoopWriters.scala", "Line Number": 129 }, { "Declaring Class": "org.apache.spark.scheduler.ResultTask", "Method Name": "runTask", "File Name": "ResultTask.scala", "Line Number": 90 }, { "Declaring Class": "org.apache.spark.scheduler.Task", "Method Name": "run", "File Name": "Task.scala", "Line Number": 131 }, { "Declaring Class": "org.apache.spark.executor.Executor$TaskRunner", "Method Name": "$anonfun$run$3", "File Name": "Executor.scala", 
"Line Number": 497 }, { "Declaring Class": "org.apache.spark.util.Utils$", "Method Name": "tryWithSafeFinally", "File Name": "Utils.scala", "Line Number": 1439 }, { "Declaring Class": "org.apache.spark.executor.Executor$TaskRunner", "Method Name": "run", "File Name": "Executor.scala", "Line Number": 500 }, { "Declaring Class": "java.util.concurrent.ThreadPoolExecutor", "Method Name": "runWorker", "File Name": "ThreadPoolExecutor.java", "Line Number": 1149 }, { "Declaring Class": "java.util.concurrent.ThreadPoolExecutor$Worker", "Method Name": "run", "File Name": "ThreadPoolExecutor.java", "Line Number": 624 }, { "Declaring Class": "java.lang.Thread", "Method Name": "run", "File Name": "Thread.java", "Line Number": 750 } ], "Task Launch Time": 1667223676080, "Stage ID": 1, "Stage Attempt ID": 0, "Task Type": "ResultTask", "Executor ID": "9", "Task ID": 11 }

Asked 2 years ago · 398 views
1 Answer
AWS
EXPERT
Answered 2 years ago
