How to transfer a whole dataset at once from BigQuery to S3 with AWS Glue?


Hi Dears,

Hope all is great with you.

I am trying to migrate data from Google BigQuery to AWS S3. I have an issue on the source side, as shown below (console).

Connection options:

Enter additional key-value pairs for your data source connection:

- Key: parentProject, Value: serene-craft-3363XX
- Key: table, Value: bigquery-public-data:austin_bikeshare.bikeshare_stations

My question: if I want to specify a whole dataset instead of a single table, how should I do that?

Note that I tried writing (dataset) as the key, but it did not work.

Can you advise, please? Thanks in advance, Basem

1 Answer

Accepted Answer

Hi,

If I guess correctly, you are using AWS Glue Studio and the AWS Glue BigQuery connector.

Currently, the Glue BigQuery connector works at the table level (as the BigQuery Spark connector does).

If you want to export all the tables in a dataset, you can edit the script generated by Glue Studio and customize it.

You would first need to add the google.cloud Python library, using the method mentioned here.
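For example, one common way to do this (an assumption on my side, since the linked method isn't reproduced here) is to install the extra Python module through the job parameter for additional modules:

```
--additional-python-modules google-cloud-bigquery
```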

Then, before reading a table, read the list of tables in the dataset, as described here.

Finally, iterate over the tables and read/write each one to S3, along the lines of the sketch below.
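As a rough illustration, the customized script could look like this. It is a minimal sketch only: the connection_type, the connection option names, the connection name my-bigquery-connection, the target bucket my-bucket, and the Parquet output format are assumptions you would replace with whatever your Glue Studio-generated script already contains, and it assumes the job has Google credentials available to the google.cloud.bigquery client.

```python
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from google.cloud import bigquery

glue_context = GlueContext(SparkContext.getOrCreate())

# List every table in the dataset with the BigQuery client library
# (assumes Google credentials are available to the job).
bq_client = bigquery.Client(project="serene-craft-3363XX")
tables = bq_client.list_tables("bigquery-public-data.austin_bikeshare")

for table in tables:
    table_ref = f"{table.project}:{table.dataset_id}.{table.table_id}"

    # Read one table through the BigQuery connector, reusing the
    # connection options from the Glue Studio-generated script.
    dyf = glue_context.create_dynamic_frame.from_options(
        connection_type="marketplace.spark",  # as generated by Glue Studio; may differ
        connection_options={
            "parentProject": "serene-craft-3363XX",
            "table": table_ref,
            "connectionName": "my-bigquery-connection",  # assumed connection name
        },
    )

    # Write the table to S3, one prefix per table.
    glue_context.write_dynamic_frame.from_options(
        frame=dyf,
        connection_type="s3",
        connection_options={"path": f"s3://my-bucket/austin_bikeshare/{table.table_id}/"},
        format="parquet",
    )
```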

This is one possibility. The other would be to use an orchestrator such as Step Functions (an alternative could be Airflow) to run a Python script that reads the list of tables and then executes your job (once parameterized by table name) in parallel for each table.
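A minimal sketch of that second option, assuming the Glue job is named bigquery-to-s3 and takes a --table_name argument (both names are hypothetical; the job itself would read the argument with getResolvedOptions):

```python
import boto3
from google.cloud import bigquery

glue = boto3.client("glue")
bq_client = bigquery.Client(project="serene-craft-3363XX")

# One job run per table; Glue executes them in parallel up to the
# job's maximum concurrency setting.
for table in bq_client.list_tables("bigquery-public-data.austin_bikeshare"):
    glue.start_job_run(
        JobName="bigquery-to-s3",                    # hypothetical job name
        Arguments={"--table_name": table.table_id},  # read in the job via getResolvedOptions
    )
```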

Hope this helps.

AWS
EXPERT
answered 2 years ago
