1 Answer
Hi, I faced a similar issue when satisfying a requirement to load data from Redshift into a MySQL database. There seems to be no overwrite mode for data frames when writing to MySQL, so before the data load I truncate the target table by connecting directly to the RDS instance.
Example code:

```python
import json

import boto3
import pymysql

# Retrieve the MySQL credentials from Secrets Manager
# (secret_name is the name of the secret holding the DB credentials)
session = boto3.session.Session()
client_secret = session.client("secretsmanager")
secret_value = client_secret.get_secret_value(SecretId=secret_name)
secret_json = json.loads(secret_value["SecretString"])

user_name = secret_json["username"]
password = secret_json["password"]
host = secret_json["host"]
port = secret_json["port"]
dbname = "<database>"
table_name = "<table>"

# Truncate the target table before the load
conn = pymysql.connect(host=host, user=user_name, passwd=password,
                       port=port, database=dbname)
cur = conn.cursor()
query = "TRUNCATE TABLE {0}.{1}".format(dbname, table_name)
cur.execute(query)
conn.commit()
```
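One caution on the snippet above: formatting the database and table names straight into the TRUNCATE statement is fine for trusted, hard-coded values, but if those names ever come from external input it is worth validating them first, since identifiers cannot be passed as query parameters. A minimal sketch (the helper name is my own):

```python
import re

def truncate_query(dbname: str, table_name: str) -> str:
    """Build a TRUNCATE statement, rejecting identifiers that are not
    plain alphanumeric/underscore names (a simple injection guard)."""
    ident = re.compile(r"^[A-Za-z0-9_]+$")
    for name in (dbname, table_name):
        if not ident.match(name):
            raise ValueError(f"unsafe identifier: {name!r}")
    # Backticks let MySQL accept reserved words used as names
    return "TRUNCATE TABLE `{0}`.`{1}`".format(dbname, table_name)
```

The returned string can then be passed to `cur.execute()` exactly as in the example above.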
I hope this helps
Answered 1 year ago
Yes, preactions are only for Redshift. To add to that solution: if the connection details are stored in the Glue connection instead of the secret, there is an API in GlueContext to read them, so you can pass them to connect().
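That GlueContext API is `extract_jdbc_conf`, which returns the JDBC settings stored on a catalog connection as a dict. A minimal sketch of feeding those settings into pymysql, assuming a Glue job and a connection named `mysql-connection` (the connection name and `<database>` placeholder are illustrative):

```python
from urllib.parse import urlparse

import pymysql
from awsglue.context import GlueContext
from pyspark.context import SparkContext

glueContext = GlueContext(SparkContext.getOrCreate())

# Read the JDBC settings stored on the Glue connection;
# the dict has keys such as "url", "user", and "password"
conf = glueContext.extract_jdbc_conf("mysql-connection")

# The stored URL looks like jdbc:mysql://host:3306/db;
# strip the "jdbc:" prefix so urlparse can split out host and port
url = urlparse(conf["url"].replace("jdbc:", "", 1))

conn = pymysql.connect(host=url.hostname, port=url.port or 3306,
                       user=conf["user"], passwd=conf["password"],
                       database="<database>")
```

This only runs inside a Glue job (or a Glue development endpoint), so it is a sketch of the wiring rather than something testable standalone.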