Is there a way to use a SQL parameterized query with aws_s3.query_export_to_s3?

Hello,

I want to export data from an Aurora PostgreSQL database to an S3 bucket using aws_s3.query_export_to_s3 (related documentation here).

Using aws_s3.query_export_to_s3 with a simple SQL query (e.g. SELECT * FROM some_table) works perfectly; however, it does not work with parameterized queries. I would like to use a parameterized query, since that is the standard way to avoid SQL injection, for instance:

select *
from aws_s3.query_export_to_s3(
        'select * from some_table where some_column = $1',
        aws_commons.create_s3_uri('some-bucket', 'test.csv', 'eu-west-3'),
        options:='format csv, header, delimiter $$,$$'
    );

Is there a way to achieve this?

Or is there any plan to support parameterized queries in the future?

Thanks a lot.

Pierre
Asked 5 months ago · Viewed 198 times
2 Answers

Can you parameterize it the same way you'd parameterize a regular query? From what I could find, Postgres doesn't do parameterization on its own. This is normally provided by the calling code, for example pg-template-tag for Node.js.
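
One option, since the first argument is just a text string, is to build the query with PostgreSQL's format() function and its %L specifier, which quotes a value as a SQL literal and so protects against injection. A minimal sketch, where 'some value' stands in for whatever your calling code supplies:

select *
from aws_s3.query_export_to_s3(
        -- %L inlines the value as a properly quoted SQL literal
        format('select * from some_table where some_column = %L', 'some value'),
        aws_commons.create_s3_uri('some-bucket', 'test.csv', 'eu-west-3'),
        options:='format csv, header, delimiter $$,$$'
    );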

Answered 5 months ago
  • Hello,

    Postgres doesn't do parameterization on its own.

    What client-side libraries usually do is:

    • rewrite parameterized queries (e.g. transform some :value into $1)
    • then send a PREPARE query to postgres
    • and finally an EXECUTE (with the parameters)

    In other words, parameterized queries are handled as prepared statements, as sketched below.
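
    A minimal sketch of that flow, run directly in SQL (the statement name is illustrative):

    -- what a client library sends under the hood:
    PREPARE fetch_rows (text) AS
        select * from some_table where some_column = $1;
    EXECUTE fetch_rows('some value');  -- parameters are bound at execution
    DEALLOCATE fetch_rows;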

    I realized that PREPARE only handles SELECT, INSERT, UPDATE, DELETE, MERGE, or VALUES statements (cf. https://www.postgresql.org/docs/current/sql-prepare.html).

    So, since aws_s3.query_export_to_s3 uses COPY behind the scenes, there is no way to use a parameterized query with it.
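
    To illustrate (the statement name is made up), trying to prepare a COPY directly is rejected:

    PREPARE export_rows AS
        COPY (select * from some_table) TO STDOUT WITH (FORMAT csv);
    -- fails with a syntax error, because COPY is not a preparable statement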

    Closing this question then (EDIT: no way to do that?).

Ah, TIL. Thanks for the info.

Answered 5 months ago
