Questions tagged with Extract, Transform & Load Data
Browse through the questions and answers listed below or filter and sort to narrow down your results.
Hi,
I have 32 TB of data stored in an S3 Standard bucket and would like to make a copy on my on-premises SQL Server (local machine) via SQL Server Integration Services (SSIS).
(1) I am wondering whether it...
2 answers · 0 votes · 451 views · asked a year ago
Hi, I have run a crawler that connects via JDBC to Snowflake, and it creates a table in the Glue Catalog database; that works nicely.
Now I want to use Glue Studio to take the source (AWS Glue Data...
1 answer · 0 votes · 433 views · asked a year ago
I am trying to migrate 3 tables, and my selection rule looks like this, and I get an error:
where schema name is like 'EEOP_INTEG' and source table name is like '[EEOP_USER, EEOP_ORG, EEOP_USER_ORG]',...
2 answers · 0 votes · 432 views · asked a year ago
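A bracketed list of table names inside one rule is not valid AWS DMS selection-rule syntax; table mappings take one selection rule per table (or a wildcard). A minimal sketch of what the mapping for the three tables in the question might look like — the `rule-name` values are made up here:

```python
import json

# One selection rule per table: AWS DMS table mappings do not accept a
# bracketed list of table names inside a single rule.
tables = ["EEOP_USER", "EEOP_ORG", "EEOP_USER_ORG"]  # from the question

table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": str(i),
            "rule-name": f"include-{name}",  # hypothetical rule names
            "object-locator": {
                "schema-name": "EEOP_INTEG",
                "table-name": name,
            },
            "rule-action": "include",
        }
        for i, name in enumerate(tables, start=1)
    ]
}

# This JSON is what you would paste into the task's table-mappings field.
print(json.dumps(table_mappings, indent=2))
```

`%` wildcards in `table-name` are also an option when the table names share a prefix, but explicit rules keep the selection unambiguous.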
When I am trying to upload a regular data table from the query, it is not being uploaded to QuickSight. The Full Refresh is also giving problems. When I run the query on Hubble, it works...
1 answer · 0 votes · 261 views · asked a year ago
Hello everybody,
I'm working for a small retail company where we want to move our data to AWS. Our data can come from different sources (a Postgres DB, a MySQL DB, or other unstructured data sources, or...
2 answers · 0 votes · 305 views · asked a year ago
Issue importing custom Python modules in Jupyter notebooks with SparkMagic and AWS Glue endpoint
I'm encountering an issue while attempting to run ETL scripts in Jupyter notebooks using SparkMagic, which is connected to an AWS Glue endpoint via SSH. I followed the tutorial provided in the AWS...
2 answers · 0 votes · 670 views · asked a year ago
I am trying to create a **Custom Visual Transform** but unfortunately am facing some issues.
Here my motive is to truncate a **MySQL** table before loading the data into it, and I want to do it with the...
1 answer · 0 votes · 256 views · asked a year ago
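One way to get truncate-before-load semantics is to open a direct DB-API connection inside the transform, clear the target table, and only then write the batch. A minimal sketch of that pattern — `sqlite3` stands in here so the example is runnable; against a real MySQL target you would connect with a MySQL driver (e.g. `pymysql`) and issue `TRUNCATE TABLE` instead of `DELETE`:

```python
import sqlite3

# Sketch of "truncate before load". sqlite3 is a stand-in target so the
# example runs anywhere; on MySQL the DELETE below would be
# "TRUNCATE TABLE target" over a pymysql connection.

def truncate_then_load(conn, table, rows):
    cur = conn.cursor()
    cur.execute(f"DELETE FROM {table}")  # TRUNCATE TABLE on MySQL
    cur.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, name TEXT)")
conn.execute("INSERT INTO target VALUES (0, 'stale')")  # pre-existing row

truncate_then_load(conn, "target", [(1, "a"), (2, "b")])
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # -> 2
```

Running the truncate from driver-side code (rather than per-partition on the executors) matters: a per-partition truncate would wipe rows that earlier partitions already wrote.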
Hi, I am using the G.2X worker type; during the data-write operation the Glue job allocates only 5 executors and will not allocate more, which makes the process take very long. How can I resolve this issue?
3 answers · 0 votes · 464 views · asked a year ago
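Two levers usually govern this: the job's worker count (which caps how many executors exist) and the number of output partitions (which caps how many executors the write stage can keep busy). A sketch of the job-run parameters, assuming the bottleneck is capacity — the job name and the specific numbers are placeholders, not from the question:

```python
# Sketch: raise the worker count so more executors are available, then make
# sure the write stage has enough partitions to use them. Values are
# illustrative placeholders.
job_args = {
    "JobName": "my-etl-job",   # hypothetical job name
    "WorkerType": "G.2X",
    "NumberOfWorkers": 20,     # raised from a small default
}
# A boto3 call would then be:
#   boto3.client("glue").start_job_run(**job_args)

# Inside the job script, repartition before the write so the extra
# executors actually receive tasks, e.g.:
#   df = df.repartition(80)  # a few partitions per executor

print(job_args["NumberOfWorkers"])
```

If the job already has many workers but still uses 5 executors, the write likely has only ~5 partitions, and the `repartition` side of the sketch is the relevant fix.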
How do I fix this to execute jobs from AWS Glue to Redshift? Error: "JobName:s3-redshift and JobRunId:jr_30d8ac0c6b35d44f641b7d5b55819365897b18e6244c8a5559bff53efc8e23c1 failed to execute with...
2 answers · 0 votes · 244 views · asked a year ago
Is delivering records to a Security Lake custom source prefix not possible with Kinesis Firehose?
I'm trying to set up custom sources for Security Lake and have a Kinesis Firehose delivery stream configured to deliver parquet files into the Security Lake bucket under the ext/ prefix.
The problem...
1 answer · 0 votes · 293 views · asked a year ago
Hi, I have a crawler that connects via JDBC to Snowflake, and the process finishes well, but it doesn't create any table. Here is the CloudWatch output:
BENCHMARK : Classification complete,...
2 answers · 0 votes · 346 views · asked a year ago
At present, Glue supports the Iceberg framework, but the MERGE INTO syntax is needed here. I set it up according to the official website's article, but I can't get it to succeed; there is always an error of...
0 answers · 0 votes · 72 views · asked a year ago
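MERGE INTO on Iceberg requires the Iceberg SQL extensions to be registered on the Spark session; without them Spark rejects the statement outright. A sketch of the session configuration involved — the catalog name, database/table names, and warehouse path are placeholders:

```python
# Sketch of the Spark conf that Iceberg MERGE INTO needs on Glue.
# Catalog name "glue_catalog" and the warehouse path are placeholders.
conf = {
    "spark.sql.extensions":
        "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
    "spark.sql.catalog.glue_catalog":
        "org.apache.iceberg.spark.SparkCatalog",
    "spark.sql.catalog.glue_catalog.warehouse":
        "s3://my-bucket/warehouse/",  # hypothetical
    "spark.sql.catalog.glue_catalog.catalog-impl":
        "org.apache.iceberg.aws.glue.GlueCatalog",
    "spark.sql.catalog.glue_catalog.io-impl":
        "org.apache.iceberg.aws.s3.S3FileIO",
}

# With the extensions registered, a MERGE of this shape becomes legal SQL
# (table and column names here are illustrative):
merge_sql = """
MERGE INTO glue_catalog.db.target t
USING updates u
ON t.id = u.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *
"""
print(len(conf))
```

On Glue you would also pass the `--datalake-formats iceberg` job parameter so the Iceberg jars are on the classpath, then apply each `conf` entry when building the `SparkSession`; a missing `spark.sql.extensions` entry is a frequent cause of MERGE INTO failures.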