How can I test connection to legacy DB using GlueContext of Spark


Hi, I'm running a data pipeline from a legacy DB (Oracle) to Redshift using AWS Glue. I want to test the connection to the legacy DB before executing the ETL, without putting a test query into the working Python script.

The as-is process of our script is: connect to the legacy DB and query > load data to S3.

I want the to-be process of our script to be:

test legacy DB connection > if it fails > no ETL

test legacy DB connection > if it succeeds > query > load data to S3

So I scripted the following above the ETL code. This code uses pyspark.context and format("jdbc").

connection_oracle_options = {
    "url": "jdbc:oracle:thin:@//<jdbc-host-name>:1521/ORCL",
    "user": "admin",
    "password": "pwd"
}

I did not pass the "dbtable" argument; I read that this parameter is not marked 'Required' in the documentation.

When I do this, can the connection test be achieved? Or are there better options?
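One lightweight pre-check (a sketch, not from the original post) is a plain TCP reachability test against the host and port in the JDBC URL above, before Spark reads anything. It catches network, DNS, and listener failures, though not bad credentials; the function name `db_reachable` is illustrative, not a Glue or Spark API:

```python
import socket

def db_reachable(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the Oracle listener before running the ETL
# if not db_reachable("<jdbc-host-name>", 1521):
#     raise RuntimeError("Legacy DB unreachable - skipping ETL")
```

Because it never logs in, this check stays fast and needs no "dbtable" or query at all; a failed login would still only surface later, at the JDBC read.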

asked a year ago · 297 views
1 Answer

Hi, you may want to follow this testing process:

AWS
answered a year ago
  • Thanks for the answer :) but I want to run the test on every Glue job run, because the test will be executed in the ETL script. My goal is: before executing the Glue ETL, if the connection fails (wrong connection information, a network error, etc.), do not execute the ETL that follows.
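What the comment describes could be sketched as a small guard function at the top of the Glue script: run a trivial JDBC query in a try/except and abort before any ETL work if it fails. This is an assumption-laden sketch, not a Glue API: it relies on Spark's `query` JDBC option (available since Spark 2.4, so no "dbtable" needed) and on Oracle's `SELECT 1 FROM DUAL` idiom, and `validate_connection` is an illustrative name:

```python
# Sketch of a pre-ETL JDBC connection check, reusing the
# connection_oracle_options dict from the question.

def validate_connection(spark, options):
    """Return True if a trivial JDBC query succeeds, False otherwise."""
    try:
        (spark.read.format("jdbc")
              .options(**options)
              .option("query", "SELECT 1 FROM DUAL")  # touches no app table
              .load()
              .collect())
        return True
    except Exception as exc:  # connection, auth, or network errors land here
        print(f"Connection test failed: {exc}")
        return False

# In the Glue script, before any ETL step:
# if not validate_connection(spark, connection_oracle_options):
#     raise RuntimeError("Legacy DB unreachable - skipping ETL")
```

Unlike a bare TCP check, this also verifies credentials, since the JDBC driver must actually log in to run the query.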
