1 Answer
Hi,
If you have job bookmarks enabled, are you sure there is new data in S3 for the second run?
If not, the read step will produce an empty DataFrame, which can cause the write to BigQuery to fail.
As a workaround, you could add a try/catch or conditional logic: check whether the DataFrame you read contains any data, write to BigQuery only if it does, and otherwise just log a message that no input is available at the moment.
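The conditional-write guard could be sketched roughly like this. This is a minimal illustration, not Glue-specific code: `frame` stands in for the DynamicFrame (or DataFrame) returned by the bookmarked S3 read, and `write_fn` is a placeholder for whatever performs the BigQuery write (for example, a wrapper around `glueContext.write_dynamic_frame.from_options`). Both names are assumptions for this sketch.

```python
def write_if_nonempty(frame, write_fn):
    """Call write_fn(frame) only when the frame has rows; otherwise log and skip.

    `frame` is assumed to expose a count() method, as Glue DynamicFrames and
    Spark DataFrames both do; `write_fn` is a placeholder for the BigQuery write.
    """
    if frame.count() == 0:
        # Bookmarks found no new S3 objects on this run: skip the write entirely
        # instead of letting an empty write fail the job.
        print("No new input from S3 on this run; skipping BigQuery write.")
        return False
    write_fn(frame)
    return True
```

Note that `count()` on a large frame triggers a full pass over the data; if that is too expensive, checking whether the first partition is empty (e.g. `frame.toDF().head(1)` on a DynamicFrame) is a cheaper alternative.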
Hope this helps,
Yes, more data is present in S3; I printed the data and checked it just before writing, but it still throws this error. I thought it might be something around the nullability of the columns, but I have fixed that too, by setting the nullable property of the source columns to True, the same as the target, and I still get the same error. I am clueless now!