How to transfer a whole dataset at once from BigQuery to S3 with AWS Glue?


Hi all,

Hope all is great with you.

I am trying to migrate data from Google BigQuery to AWS S3, and I have an issue on the source side, as shown below (console).

Connection options:

Enter additional key-value pairs for your data source connection:

Key: parentProject, Value: serene-craft-3363XX

Key: table, Value: bigquery-public-data:austin_bikeshare.bikeshare_stations

My question: if I want to specify a whole dataset instead of a single table, how can I do that?

Note that I tried writing the dataset name instead, but it did not work!

Can you advise please? Thanks in advance, Basem

Asked 2 years ago · 546 views
1 Answer

Accepted Answer

Hi,

If I guess correctly, you are using AWS Glue Studio and the AWS Glue BigQuery connector.

Currently, the Glue BigQuery connector works at the table level (as the BigQuery Spark connector does).

If you want to export all the tables in a dataset, you can edit the script generated by Glue Studio and customize it.

You would first need to add the google.cloud Python library using the method mentioned here.

Then, before reading a table, read the list of tables in the dataset as described here.

Finally, iterate over the tables and read/write each one to S3, as shown in the sketch below.
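A minimal sketch of that customized job script, assuming the marketplace BigQuery connector generated by Glue Studio and that google-cloud-bigquery has been added to the job (for example via --additional-python-modules). The connection option names can differ between connector versions, and the connection name, S3 bucket, and dataset values are placeholders you would replace with your own:

```python
import sys

from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from google.cloud import bigquery
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())

parent_project = "serene-craft-3363XX"                  # your GCP billing project
dataset = "bigquery-public-data.austin_bikeshare"       # dataset to export
s3_prefix = "s3://my-bucket/bigquery-export/"           # placeholder bucket/prefix

# 1. List every table in the dataset with the BigQuery client library.
#    The client needs GCP credentials available to the job (e.g. a service
#    account key exposed to the driver), which is not shown here.
bq_client = bigquery.Client(project=parent_project)
table_ids = [t.table_id for t in bq_client.list_tables(dataset)]

# 2. Read each table through the Glue BigQuery connector and write it to S3.
for table_id in table_ids:
    dyf = glue_context.create_dynamic_frame.from_options(
        connection_type="marketplace.spark",             # Glue Studio BigQuery connector
        connection_options={
            "parentProject": parent_project,
            "table": f"{dataset}.{table_id}",
            "connectionName": "my-bigquery-connection",   # placeholder connection name
        },
    )
    glue_context.write_dynamic_frame.from_options(
        frame=dyf,
        connection_type="s3",
        connection_options={"path": f"{s3_prefix}{table_id}/"},
        format="parquet",
    )
```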

This is one possibility; the other would be to use an orchestrator such as Step Functions (an alternative could be Airflow) to run a Python script that reads the list of tables and then executes your job (once parametrized by table name) in parallel for each table.
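As a rough illustration of that second option, a small driver script (for example a Lambda task invoked from a Step Functions state machine) could list the dataset's tables and start one run of the table-parametrized Glue job per table. The job name and the --table_name argument below are hypothetical; the job script would read the argument with getResolvedOptions:

```python
import boto3
from google.cloud import bigquery

glue = boto3.client("glue")
bq_client = bigquery.Client(project="serene-craft-3363XX")

# Start one Glue job run per table in the dataset.
for table in bq_client.list_tables("bigquery-public-data.austin_bikeshare"):
    glue.start_job_run(
        JobName="bigquery-to-s3-export",              # placeholder job name
        Arguments={"--table_name": table.table_id},   # consumed by the job script
    )
```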

Hope this helps.

AWS
EXPERT
Answered 2 years ago
