How can I run a post action script while writing to Redshift from AWS Glue?

        df.toDF().write.format("jdbc").\
        option("url", "").\
        option("dbtable", f"public.{tableName}_staging").\
        option("user", "").\
        option("password", "").\
        option("postactions", f"DROP TABLE IF EXISTS $staging").\
        mode('append').save()

I have specified the post actions script in the options, but it is still not executed. Is there a different way to execute a post action script against Redshift from an AWS Glue notebook? I used the script from https://stackoverflow.com/questions/49735489/upsert-from-aws-glue-to-amazon-redshift

asked a year ago · 334 views
1 Answer

Please note that postactions is a feature of the Glue DynamicFrame Redshift sink, while you are converting the frame to a standard Spark DataFrame (toDF()) and writing with the standard JDBC driver instead of the Glue native Redshift connector, so the option is silently ignored.

My advice is to write that DynamicFrame using a sink provided by GlueContext (as in the example you reference). Otherwise you would have to open your own connection to Redshift to run that DROP DDL (e.g. via a JDBC connection or a Python library like psycopg2).
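For reference, a minimal sketch of the GlueContext sink approach, assuming df is the DynamicFrame from the question and that a Glue catalog connection to Redshift already exists; the connection name, database, and S3 temp path below are placeholders:

from awsglue.context import GlueContext
from pyspark.context import SparkContext

sc = SparkContext.getOrCreate()
glueContext = GlueContext(sc)

tableName = "my_table"  # placeholder, same table as in the question
staging = f"public.{tableName}_staging"

# Write the DynamicFrame (df, before any toDF()) through the Glue Redshift sink.
# "postactions" takes a semicolon-separated list of SQL statements that run after
# the load completes; the upsert statements from the referenced example could be
# chained here before the DROP.
glueContext.write_dynamic_frame.from_jdbc_conf(
    frame=df,
    catalog_connection="my-redshift-connection",  # placeholder Glue connection name
    connection_options={
        "database": "my_database",                # placeholder Redshift database
        "dbtable": staging,
        "postactions": f"DROP TABLE IF EXISTS {staging};",
    },
    redshift_tmp_dir="s3://my-temp-bucket/redshift/",  # placeholder temp dir
)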

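If you keep the plain Spark JDBC writer instead, a sketch of the second option is to run the DDL yourself after the write finishes, for example with psycopg2 (endpoint, database, and credentials below are placeholders):

import psycopg2

tableName = "my_table"  # placeholder, same table as in the question

# Open a direct connection to the Redshift cluster and run the statement
# that the ignored "postactions" option was meant to execute.
conn = psycopg2.connect(
    host="my-cluster.xxxxxxxxxxxx.us-east-1.redshift.amazonaws.com",  # placeholder endpoint
    port=5439,
    dbname="my_database",    # placeholder
    user="my_user",          # placeholder
    password="my_password",  # placeholder
)
try:
    with conn.cursor() as cur:
        cur.execute(f"DROP TABLE IF EXISTS public.{tableName}_staging")
    conn.commit()
finally:
    conn.close()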
AWS EXPERT · answered a year ago
