AWS Glue Include and Exclude Patterns: unexpected behaviour


I am writing an AWS Glue 3.0 PySpark script that reads data from an S3 bucket, performs some transformations, and writes the result back to S3. To do this I am using GlueContext.create_dynamic_frame_from_options with the S3 connection type. It accepts an optional exclusions parameter, which takes include and exclude patterns based on glob syntax. The S3 bucket contains deeply nested files, and I only want to read files with the .json extension while excluding files with the .csv and .txt extensions, so I pass "exclusions": ['**/*.csv', '**/*.txt'].

When executing the PySpark script below, I get the following error, where <file-name> is replaced with the name of the file: An error occurred while calling o90.pyWriteDynamicFrame. Unable to parse file: <file-name>.data.csv

dyf_read_source_s3 = glueContext.create_dynamic_frame.from_options(
    connection_type="s3",
    format="json",
    connection_options={
        "paths": [<path>],
        # glob-style patterns for files that should be skipped
        "exclusions": ['**/*.csv', '**/*.txt'],
        # read nested sub-directories under each path
        "recurse": True,
        # group small files within a partition into fewer read tasks
        "groupFiles": "inPartition",
    },
    transformation_ctx="dyf_read_source_s3",
)

I have locally re-created an example of the directory structure and used Python's glob module, which successfully matches the correct files with this same syntax (see the sketch below). This leads me to believe there is an issue/bug in the source code.
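A minimal sketch of that local check, assuming a made-up directory layout (the file names below are placeholders; only the glob patterns mirror the exclusions passed to the Glue call above):

import glob
import os
import tempfile

# Build a small nested tree with mixed file types.
root = tempfile.mkdtemp()
for rel in ["a/b/one.json", "a/b/two.data.csv", "a/c/three.txt", "four.json"]:
    path = os.path.join(root, rel)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    open(path, "w").close()

# recursive=True lets ** span directory levels, like the Glue pattern syntax.
excluded = (glob.glob(os.path.join(root, "**/*.csv"), recursive=True)
            + glob.glob(os.path.join(root, "**/*.txt"), recursive=True))
all_files = [f for f in glob.glob(os.path.join(root, "**/*"), recursive=True)
             if os.path.isfile(f)]
remaining = [f for f in all_files if f not in excluded]

print(sorted(excluded))   # the .csv and .txt files the patterns should exclude
print(sorted(remaining))  # only the .json files are left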

Asked 2 years ago · 85 views
No answers
