The issue is that, by default, the ESCAPE character must appear before every occurrence of the character you want to escape. So in your case you would need to modify the CSV file itself to double any internal double quotes:

32533|Lease changed from ""12 month"" to ""month""

Then your ESCAPE syntax would work:

ESCAPE '"'

However, an easier approach is to use the QUOTE parameter instead of ESCAPE. It specifies the quoting character, and occurrences of that character inside a field are treated as part of the value rather than as the end of the field. For example:

QUOTE '"'

With this you don't need to modify the CSV file at all: any double quotes inside fields are preserved as part of the value on import. So the full SQL would be:
SELECT aws_s3.table_import_from_s3(
   'my_table',
   '',
   '(format csv, QUOTE ''"'', NULL ''NULL'', DELIMITER ''|'')',
   (SELECT aws_commons.create_s3_uri('my_bucket', 'my_file.csv', 'us-east-1')),
   aws_commons.create_aws_credentials('xxxxx', 'xxxxx', '')
);
Let me know if that helps resolve the issue! The QUOTE option is usually the easiest way to handle quoting characters when importing CSV.
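For intuition, Python's csv module follows the same CSV convention PostgreSQL does: inside a quoted field, a literal quote character is represented by doubling it. A minimal sketch (the sample row is adapted from the question; the surrounding quotes around the second field are added for illustration):

```python
import csv
import io

# A pipe-delimited row whose second field is quoted, with the internal
# double quotes doubled -- the form the ESCAPE '"' route requires.
doubled = '32533|"Lease changed from ""12 month"" to ""month"""\n'

row = next(csv.reader(io.StringIO(doubled), delimiter='|'))
print(row[1])  # Lease changed from "12 month" to "month"
```

The doubled quotes collapse back to single literal quotes in the parsed value, which is exactly what the import should produce.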
Thank you for your comment. QUOTE ''"'' still does not work. However, I noticed that QUOTE E''\b'' does work. Do you know why that is?
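A likely explanation (my assumption, not confirmed in this thread): E'\b' is the backspace character, which presumably never occurs in the data, so setting it as the QUOTE character effectively disables quote handling and the embedded double quotes are read as literal text. A rough sketch of the analogous effect with Python's csv module, where QUOTE_NONE turns quote processing off:

```python
import csv
import io

# The raw row from the question, with bare (undoubled) double quotes.
raw = '32533|Lease changed from "12 month" to "month"\n'

# QUOTE_NONE disables quote processing entirely -- comparable to setting
# QUOTE to a character (like backspace) that never appears in the data.
row = next(csv.reader(io.StringIO(raw), delimiter='|',
                      quoting=csv.QUOTE_NONE))
print(row[1])  # Lease changed from "12 month" to "month"
```

With quote handling out of the picture, the parser only splits on the delimiter, so the quotes pass through untouched.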