Unanswered Questions tagged with AWS Glue
I'm trying to run a Python script in AWS Glue that uses athena.get_query_runtime_statistics. When I run it on my local machine the script works, but running it in Glue returns this error
0 answers · 0 votes · 34 views · asked a day ago
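A common cause of a "works locally, fails in Glue" error with this API is that the boto3/botocore bundled in the Glue runtime predates the release that added athena.get_query_runtime_statistics (mid-2022; the exact version cutoff below is an assumption). A minimal sketch that guards the call on the installed version:

```python
def supports_runtime_statistics(botocore_version: str) -> bool:
    """Return True if this botocore version is new enough to expose
    athena.get_query_runtime_statistics (the 1.27 cutoff is an assumption)."""
    major, minor = (int(part) for part in botocore_version.split(".")[:2])
    return (major, minor) >= (1, 27)

# In the Glue script, guard the call instead of assuming the API exists:
# import botocore
# if supports_runtime_statistics(botocore.__version__):
#     stats = athena.get_query_runtime_statistics(QueryExecutionId=qid)
# else:
#     ...  # fall back, or ship a newer boto3 to the job
```

If the version is too old, Glue jobs can pull a newer boto3 at start-up via the `--additional-python-modules` job parameter (e.g. `boto3==1.28.0`).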
I set up a job using the Glue visual editor, connected it to the appropriate table in Oracle, and the data appears to be selected perfectly from the source:
![Data looks...
0 answers · 0 votes · 45 views · asked a day ago
I am trying to add a default to an existing field in an Avro schema in AWS Glue, but the change isn't registering as a new version.
Is this behavior expected? If so, why? If not, how can I go about...
0 answers · 0 votes · 63 views · asked 4 days ago
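One possible explanation (an assumption, not stated in the question) is that Avro's Parsing Canonical Form strips `default` attributes, so a registry that compares canonical forms can treat the schema with the added default as identical to the previous version and skip registering a new one. A sketch of the kind of field change being described, with hypothetical record and field names:

```json
{
  "type": "record",
  "name": "ExampleRecord",
  "fields": [
    {"name": "existing_field", "type": ["null", "string"], "default": null}
  ]
}
```

Note that for a union type, Avro requires the default to match the first branch of the union (hence `null` first here).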
I have data that I collect from AWS Batch and CloudWatch. I made a Lambda function that runs daily, collecting that data and saving the result to S3. I have a folder called 'logs' and the...
0 answers · 0 votes · 451 views · asked 5 days ago
I'm experimenting with Amazon DataZone and encountered something unexpected. I have a simple setup with one AWS account and one DataZone domain, which includes:
1 Glue Table
1 S3 bucket with my...
0 answers · 0 votes · 261 views · asked 6 days ago
I'm creating a role in AWS Glue to read CSV files from an S3 bucket. I'm granting full access to S3, but I can't seem to avoid this error. I contacted support, and they suggested increasing the usage...
0 answers · 0 votes · 58 views · asked 12 days ago
This was working before, as recently as a week or two ago, but Athena now fails with "INVALID_PARAMETER_USAGE: Incorrect number of parameters: expected 207 but found 0." when the query has more than...
0 answers · 0 votes · 73 views · asked 13 days ago
AWS Glue Job Error
I'm trying to convert CSV files in S3 to Parquet in another S3 bucket. First I catalog the CSV files using a crawler, load the data into a table, and then use a job to convert from the table to S3 in...
0 answers · 0 votes · 301 views · asked 13 days ago
Hello,
We set up AWS DMS, where the source is MS SQL Server 2019 and the target is S3 (with Parquet), configured for CDC replication. It is important for us to verify that DDL changes on the source are handled as well:
1)...
0 answers · 0 votes · 229 views · asked 18 days ago
I've been trying to test out Iceberg tables with Amazon Redshift Spectrum and have come across a major issue.
Here is my setup:
1. I create an Iceberg table via Spark (EMR 7.0) and insert data across...
0 answers · 1 vote · 456 views · asked 20 days ago
I got an error when writing a Glue DynamicFrame to a Redshift table.
The code I used:
my_conn_op={
"dbtable":"public.xxxx",
...
0 answers · 0 votes · 406 views · asked 21 days ago
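The snippet above is truncated, but when writing a DynamicFrame to Redshift via GlueContext.write_dynamic_frame.from_jdbc_conf, the connection options usually need both the target table and the database name, with a temporary S3 directory passed separately. A sketch of the usual shape; all names, ARNs, and paths below are hypothetical placeholders, not values from the question:

```python
# Hypothetical connection options for a Glue -> Redshift write
# (table/database names and role ARN are placeholders, not from the question).
my_conn_op = {
    "dbtable": "public.example_table",  # target Redshift table
    "database": "dev",                  # Redshift database name
    "aws_iam_role": "arn:aws:iam::123456789012:role/example-redshift-role",
}

# Typical call shape (sketch; requires the awsglue job runtime):
# glueContext.write_dynamic_frame.from_jdbc_conf(
#     frame=dyf,
#     catalog_connection="my-redshift-connection",  # hypothetical connection name
#     connection_options=my_conn_op,
#     redshift_tmp_dir="s3://example-bucket/temp/",
# )
```

A missing "database" key or temporary directory is a frequent cause of write failures in this setup, though the truncated error makes it impossible to say which applies here.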
Hello.
I am trying to configure a specific IAM permission for a user. I need a permission that only allows reading tables from an existing Data Catalog.
So, I have configured this policy:
```
{
"Version":...
0 answers · 0 votes · 188 views · asked 22 days ago
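The policy in the question is truncated, but a read-only Data Catalog policy is typically built from the Glue `Get*` actions on the catalog, database, and table resources. A minimal sketch, with the resource ARNs left as wildcards that should be scoped down to specific databases and tables in practice:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "glue:GetDatabase",
        "glue:GetDatabases",
        "glue:GetTable",
        "glue:GetTables",
        "glue:GetPartition",
        "glue:GetPartitions"
      ],
      "Resource": [
        "arn:aws:glue:*:*:catalog",
        "arn:aws:glue:*:*:database/*",
        "arn:aws:glue:*:*:table/*/*"
      ]
    }
  ]
}
```

Note that table-level Glue actions also require access to the enclosing database and catalog resources, which is why all three ARN forms appear even for a read-only grant.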