All Content tagged with Extract Transform & Load Data

696 results
I'm testing DataBrew as a no-code ETL option. Unfortunately, even a small toy job ran for over 10 minutes: * Read a 1,000-row dataset from RDS * Filter on a date column and restrict to 2 days -> 20 re...
2 answers · 0 votes · 41 views · asked 4 days ago
I'm running a simple test from Redshift to Redshift using the JDBC connector and Glue ETL. The objective is to test the job bookmark feature. I don't have a primary key defined in the source, so I am using LoadTimestamp w...
1 answer · 0 votes · 39 views · asked 9 days ago
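A minimal PySpark sketch of the setup described in the question above, assuming a hypothetical catalog database `analytics`, table `source_table`, and a timestamp column `loadtimestamp`; without a primary key, the bookmark key must be a monotonically increasing column, and it is passed through `additional_options`:

```python
import sys
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glueContext = GlueContext(sc)
job = Job(glueContext)
job.init(args["JOB_NAME"], args)  # bookmark state is tracked per job name

# Read the JDBC source through the Data Catalog; point the bookmark at a
# monotonically increasing column since there is no primary key.
source = glueContext.create_dynamic_frame.from_catalog(
    database="analytics",            # hypothetical database name
    table_name="source_table",       # hypothetical table name
    additional_options={
        "jobBookmarkKeys": ["loadtimestamp"],
        "jobBookmarkKeysSortOrder": "asc",
    },
    transformation_ctx="source",     # required for bookmarking
)

# ... transform and write to the Redshift target here ...

job.commit()  # persists the bookmark for the next run
```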
I am trying to use the Glue table optimization feature found in the Glue console (open any table -> "Enable optimization"). The process asks for an IAM role, for which I created a role...
1 answer · 0 votes · 44 views · asked 17 days ago
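When the console flow rejects the role, enabling the same feature through the API can make the failing permission easier to pin down. A minimal boto3 sketch, assuming a hypothetical database `analytics`, table `events`, and an existing role ARN:

```python
import boto3

glue = boto3.client("glue", region_name="us-east-1")  # region is an assumption

# Enable compaction optimization on an existing Iceberg table.
# The role must be assumable by glue.amazonaws.com and have access to the
# table's S3 location plus Glue/Lake Formation data permissions.
glue.create_table_optimizer(
    CatalogId="123456789012",        # your AWS account ID
    DatabaseName="analytics",        # hypothetical database
    TableName="events",              # hypothetical table
    Type="compaction",
    TableOptimizerConfiguration={
        "roleArn": "arn:aws:iam::123456789012:role/GlueTableOptimizerRole",
        "enabled": True,
    },
)
```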
I put a CSV file into the SageMaker data lake, and when I go to preview it I receive this error
1 answer · 0 votes · 32 views · asked 24 days ago
We are utilizing AWS Glue as our ETL engine to transfer data from on-premises and cloud databases to an S3 bucket. Currently, we have a Lambda function that triggers a Glue job, but each time we invok...
2 answers · 0 votes · 105 views · asked 24 days ago
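For reference, a minimal boto3 sketch of a Lambda handler that starts a Glue job, assuming a hypothetical job name `etl-onprem-to-s3`; job arguments use the `--` prefix Glue expects, and `start_job_run` returns immediately rather than waiting for the ETL to finish:

```python
import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    # Asynchronously start the Glue job and return its run ID.
    response = glue.start_job_run(
        JobName="etl-onprem-to-s3",   # hypothetical job name
        Arguments={
            "--source_table": event.get("source_table", "default_table"),
        },
    )
    return {"JobRunId": response["JobRunId"]}
```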
Error Category: UNCLASSIFIED_ERROR; Failed Line Number: 245; An error occurred while calling o147.sql. Insufficient LF permission(s) on chorus (Service: Glue, Status Code: 400, Request ID: 0054e9fa-5d...
2 answers · 0 votes · 51 views · asked a month ago
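An error like the one above usually means the job's IAM role lacks Lake Formation data permissions on the resource named in the message. A hedged boto3 sketch of granting them, assuming `chorus` is the database name (it may instead be a table) and that the role ARN and table name are placeholders:

```python
import boto3

lf = boto3.client("lakeformation")

# Grant the Glue job's role SELECT/DESCRIBE on the table it reads.
# All principal and resource names below are placeholders for illustration.
lf.grant_permissions(
    Principal={
        "DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/GlueJobRole"
    },
    Resource={
        "Table": {
            "DatabaseName": "chorus",
            "Name": "my_table",   # hypothetical table name
        }
    },
    Permissions=["SELECT", "DESCRIBE"],
)
```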
I've created a Glue ETL job with a Redshift table as the source node and a dropDuplicates transform over some specific keys (not the entire row, because they have created timestamps). I understand the...
1 answer · 0 votes · 40 views · asked a month ago
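For context, a plain PySpark sketch of deduplicating on specific keys while keeping the most recent record per key (column names `id` and `created_ts` are assumptions); note that `dropDuplicates(subset=...)` keeps an arbitrary row per key, so a window is needed if "latest wins" matters:

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("source_view")  # hypothetical source registered as a view

# Option 1: keep an arbitrary row per key.
deduped_any = df.dropDuplicates(subset=["id"])

# Option 2: keep the most recent row per key, ordered by created_ts.
w = Window.partitionBy("id").orderBy(F.col("created_ts").desc())
deduped_latest = (
    df.withColumn("rn", F.row_number().over(w))
      .filter(F.col("rn") == 1)
      .drop("rn")
)
```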
I am using AWS Database Migration Service (DMS) to migrate data from MongoDB (source) to Amazon S3 (target). The migration includes both Full Load (FL) and Change Data Capture (CDC). However, after so...
1 answer · 0 votes · 80 views · asked 2 months ago
I have a DMS job that writes records from Postgres to an S3 folder in Parquet format. For compliance reasons, I want to delete records from the Parquet files when the records are deleted from the source. I k...
1 answer · 0 votes · 48 views · asked 2 months ago
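One common pattern for the question above is periodic compaction: replay the DMS CDC output and rewrite a snapshot without deleted keys, since Parquet files are immutable. A heavily hedged PySpark sketch, assuming the CDC files carry the `Op` column DMS writes (I/U/D), a key column `id`, and a hypothetical ordering column `transact_seq`; paths and names are placeholders:

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read the DMS CDC output (full-load files do not carry the Op column).
cdc = spark.read.parquet("s3://my-bucket/dms/cdc/table/")  # hypothetical path

# Keep only the latest change per key, then drop keys whose latest change is a delete.
w = Window.partitionBy("id").orderBy(F.col("transact_seq").desc())
snapshot = (
    cdc.withColumn("rn", F.row_number().over(w))
       .filter(F.col("rn") == 1)
       .filter(F.col("Op") != "D")
       .drop("rn")
)

# Rewrite the compacted snapshot; a table format such as Iceberg would make
# row-level deletes possible without rewriting whole files.
snapshot.write.mode("overwrite").parquet("s3://my-bucket/compacted/table/")
```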
Hi AWS Experts, I am working on an AWS Glue job to transfer data from an on-premises SQL Server to AWS RDS (SQL Server). My goal is to achieve a 20-second execution time, but despite multiple optimiza...
1 answer · 0 votes · 86 views · asked 2 months ago
![Custom regex pattern](/media/postImages/original/IMLkdhYjnCSrmf2YAg4OGJJQ) I get the error below when I try to use a custom regex pattern to detect sensitive data in Glue Studio: Error Category: UNCLASS...
2 answers · 0 votes · 58 views · asked 2 months ago
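Unclassified errors on a custom pattern are often just the pattern itself failing to compile or never matching. A quick sanity-check sketch in plain Python before pasting the pattern into Glue Studio (the pattern and samples are placeholders, and Python's `re` syntax is close to, but not identical to, the Java regex engine Glue uses):

```python
import re

# Placeholder pattern: an SSN-like 3-2-4 digit string.
pattern = r"\b\d{3}-\d{2}-\d{4}\b"

samples = ["123-45-6789", "no match here"]

compiled = re.compile(pattern)  # raises re.error if the pattern is malformed
for s in samples:
    print(s, "->", bool(compiled.search(s)))
```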
Can I enable optimization in AWS Glue when I create an Iceberg table? Is there an option like a tableProperty? df.writeTo(f"{table_name}") \ .tableProperty("format-version", "...
1 answer · 0 votes · 111 views · asked 2 months ago
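For context, a sketch completing the write shown in the question, under assumptions: `tableProperty` sets Iceberg table properties (for example the format version or a target file size for compaction), while the managed Glue optimization is enabled separately (in the console or via `create_table_optimizer`), not through a write property. Catalog, table, and source names below are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.table("staging_view")               # hypothetical source DataFrame

table_name = "glue_catalog.analytics.events"   # hypothetical catalog/table

# Create or replace the Iceberg table with explicit table properties.
(
    df.writeTo(table_name)
      .using("iceberg")
      .tableProperty("format-version", "2")
      .tableProperty("write.target-file-size-bytes", "536870912")  # 512 MB target files
      .createOrReplace()
)
```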