AWS Glue Include and Exclude Patterns: unexpected behaviour


I am writing a script on AWS Glue 3.0 with PySpark that reads data from an S3 bucket, performs some transformations, and writes the result back to an S3 bucket. To read the source data I am using GlueContext.create_dynamic_frame_from_options with the S3 connection type. It accepts an optional exclusions parameter that takes include and exclude patterns based on glob syntax. The S3 bucket contains deeply nested files, and I only want to read files with the .json extension while excluding files with the .csv and .txt extensions, so I pass the following glob expressions: "exclusions": ['**/*.csv', '**/*.txt'].

When I execute the PySpark script below, I get the following error: An error occurred while calling o90.pyWriteDynamicFrame. Unable to parse file: <file-name>.data.csv, where <file-name> is replaced with the name of the file.

dyf_read_source_s3 = glueContext.create_dynamic_frame.from_options(
    connection_type="s3",
    format="json",
    connection_options={
        "paths": [<path>],
        "exclusions": ['**/*.csv', '**/*.txt'],
        "recurse": True,
        "groupFiles": "inPartition",
    },
    transformation_ctx="dyf_read_source_s3",
)
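
For reference, the error surfaces at the write step (the o90.pyWriteDynamicFrame call), since the read itself is evaluated lazily. The sink is configured roughly like the sketch below; <target-path> is a placeholder and the transformations are omitted:

dyf_transformed = dyf_read_source_s3  # transformations omitted for brevity

glueContext.write_dynamic_frame.from_options(
    frame=dyf_transformed,
    connection_type="s3",
    format="json",
    connection_options={"path": <target-path>},
    transformation_ctx="dyf_write_target_s3",
)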

I have re-created an example of the directory structure locally and tested it with Python's glob module, which successfully matches the correct files using this same syntax. This leads me to believe there is an issue/bug in the Glue source code.
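
The local check was a minimal sketch along these lines, with example_dir standing in for the re-created directory tree (the name is just an illustration):

import glob

# The same exclude patterns from the Glue job, evaluated locally against a
# re-created copy of the nested directory structure.
csv_files = glob.glob("example_dir/**/*.csv", recursive=True)
txt_files = glob.glob("example_dir/**/*.txt", recursive=True)

# Both lists contain exactly the files I expect Glue to exclude,
# so the patterns themselves appear to be valid glob syntax.
print(csv_files)
print(txt_files)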

Asked 2 years ago · Viewed 85 times
No answers
