Hello,
I want to export data from an Aurora PostgreSQL database to an S3 bucket using aws_s3.query_export_to_s3
(related documentation here).
Using aws_s3.query_export_to_s3 with a simple SQL query (e.g. select * from some_table) works perfectly; however, it does not work with a parameterized query. I would like to use a parameterized query, since that is the standard way to avoid SQL injection, for instance:
select *
from aws_s3.query_export_to_s3(
    'select * from some_table where some_column = $1',
    aws_commons.create_s3_uri('some-bucket', 'test.csv', 'eu-west-3'),
    options := 'format csv, header, delimiter $$,$$'
);
Is there a way to achieve this?
Or is there any plan to support parameterized queries in the future?
Thanks a lot.
Hello,
What client-side libraries usually do is translate a named parameter such as :value into a positional one such as $1, and then run the query as a prepared statement. In other words: parameterized queries are handled as prepared statements.
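As a sketch, this is roughly what a driver sends to the server on your behalf (the table and column names are hypothetical, reusing the ones from the question):

```sql
-- A client-side parameterized query becomes a server-side
-- prepared statement: :value in the client API maps to $1 here.
PREPARE export_stmt(text) AS
    SELECT * FROM some_table WHERE some_column = $1;
EXECUTE export_stmt('some value');
DEALLOCATE export_stmt;
```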
I realized that PREPARE only handles SELECT, INSERT, UPDATE, DELETE, MERGE, or VALUES statements (cf. https://www.postgresql.org/docs/current/sql-prepare.html).
So, knowing that aws_s3.query_export_to_s3 uses COPY behind the scenes, there is no way to use parameterized queries with it. Closing this question then. (EDIT: no way to do that?)
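That said, if the goal is only to avoid SQL injection when building the inner query, one possible workaround (a sketch, not something confirmed by this thread) is to build the query string server-side with format(), whose %L specifier quotes the argument as a properly escaped literal:

```sql
-- Sketch of a workaround: let format() quote the value safely
-- with %L, then pass the finished string to query_export_to_s3.
-- Bucket, table and column names are the ones from the question.
select *
from aws_s3.query_export_to_s3(
    format('select * from some_table where some_column = %L', 'some value'),
    aws_commons.create_s3_uri('some-bucket', 'test.csv', 'eu-west-3'),
    options := 'format csv, header, delimiter $$,$$'
);
```

This is not a true parameterized query, but %L (like quote_literal) handles embedded quotes, so it avoids naive string concatenation.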