Questions tagged with Extract Transform & Load Data
Hi everyone, I was trying to ingest CSV data into Timestream DB with the AWS SDK boto3. My credentials, region, Database_Name, and Table_Name are all correct, but I am still unable to connect to the endpoint of my...
0 answers · 0 votes · 78 views · asked 10 months ago
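A minimal sketch of the kind of ingestion the question describes, assuming a hypothetical CSV with `device_id`, `temperature`, and `ts` (epoch milliseconds) columns — the schema and names are illustrative, not from the question. The record-building is pure Python; the actual `write_records` call needs boto3, valid credentials, and a region where Timestream is available (an unsupported region surfaces as an endpoint error):

```python
import csv
import io

def rows_to_records(csv_text: str) -> list:
    """Convert CSV rows into Timestream record dicts.
    Hypothetical columns: device_id, temperature, ts (epoch ms)."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        records.append({
            "Dimensions": [{"Name": "device_id", "Value": row["device_id"]}],
            "MeasureName": "temperature",
            "MeasureValue": row["temperature"],
            "MeasureValueType": "DOUBLE",
            "Time": row["ts"],  # epoch milliseconds, Timestream's default TimeUnit
        })
    return records

def write_to_timestream(records, database, table, region):
    """Push records in batches; requires boto3 and AWS credentials."""
    import boto3  # imported here so the pure helper above has no AWS dependency
    client = boto3.client("timestream-write", region_name=region)
    for i in range(0, len(records), 100):  # WriteRecords caps at 100 per call
        client.write_records(DatabaseName=database, TableName=table,
                             Records=records[i:i + 100])
```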
JSON data is being considered as a string while loading data from Postgres to a JSON file by AWS Glue
I want to migrate Postgres data to Redshift, but I have a lot of jsonb data in Postgres, so I gave it the SUPER data type in Redshift. The problem is that while loading the data to Redshift...
1 answer · 0 votes · 994 views · asked a year ago
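One common cause is that Glue writes the jsonb column as VARCHAR, so Redshift stores the serialized text instead of SUPER. A hedged sketch of one workaround: stage into a VARCHAR column, then convert with Redshift's `JSON_PARSE` in a postaction (table and column names below are assumptions):

```python
def build_postaction(staging: str, target: str, plain_cols: list, json_col: str) -> str:
    """Build a Redshift postaction SQL statement that copies staged rows
    into the target table, converting a VARCHAR JSON column to SUPER
    via JSON_PARSE."""
    cols = ", ".join(plain_cols)
    return (
        f"INSERT INTO {target} ({cols}, {json_col}) "
        f"SELECT {cols}, JSON_PARSE({json_col}) FROM {staging};"
    )
```

The resulting string could be passed through the Glue Redshift connector's `connection_options` (it supports `preactions`/`postactions` keys), though the exact wiring depends on how the job writes to Redshift.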
The setup we currently have is a Kafka cluster with various topics: one topic per data structure, and a default consumer that runs on all of these topics. The consumer simply takes the persisted data...
2 answers · 0 votes · 262 views · asked a year ago
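The "one default consumer across many topics" pattern can be sketched, Kafka client library aside, as a handler registry keyed by topic with a fallback for unregistered topics. Everything here is illustrative, not the asker's code:

```python
from typing import Callable, Dict

class TopicDispatcher:
    """Route each consumed message to a per-topic handler, falling back
    to a default handler for topics with no specific registration."""

    def __init__(self, default: Callable[[str, bytes], None]):
        self._default = default
        self._handlers: Dict[str, Callable[[str, bytes], None]] = {}

    def register(self, topic: str, handler: Callable[[str, bytes], None]) -> None:
        self._handlers[topic] = handler

    def dispatch(self, topic: str, payload: bytes) -> None:
        # Fall back to the default consumer when no topic-specific handler exists.
        self._handlers.get(topic, self._default)(topic, payload)
```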
I signed up for the free trial; however, while trying to run ETL jobs for learning, I was charged for an AWS Glue job run. It was a mistake, since it is a thing I could not foresee. I tried to run the...
2 answers · 0 votes · 268 views · asked a year ago
I am using Data Quality to evaluate the dataset, and I am routing the failed rules and failed records. But when I check the S3 bucket for the failed records, I am seeing empty files along with the...
0 answers · 0 votes · 114 views · asked a year ago
Hi, I'm running a data pipeline from a legacy DB (Oracle) to Redshift using AWS Glue.
I want to test the connection to the legacy DB before executing the ETL, without a test query in working Python...
1 answer · 0 votes · 269 views · asked a year ago
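One lightweight pre-flight check, without issuing any SQL, is verifying that the Oracle listener is reachable at the TCP level before the job runs its ETL. This sketch assumes the host and port are known and says nothing about credentials or schema access:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within
    `timeout` seconds; False on refusal, timeout, or DNS failure."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

A Glue job could call this at startup (the default Oracle listener port is 1521) and fail fast with a clear message instead of timing out mid-ETL.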
Can't load the data
Can't load the data from the server — what can I do now?
1 answer · 0 votes · 178 views · asked a year ago
I am trying to create an INSERT INTO statement that queries a source table which has a column containing a STRUCT data type. This table is automatically created via a crawler, and the column definition...
1 answer · 0 votes · 467 views · asked a year ago
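In Athena (Trino/Presto SQL), a STRUCT value is written literally as `CAST(ROW(...) AS ROW(name type, ...))`. A sketch that builds such an INSERT from a source table's struct column, with all table, column, and field names as placeholders:

```python
def struct_insert(target: str, source: str, struct_col: str, fields: dict) -> str:
    """Build an Athena INSERT INTO ... SELECT that rebuilds a STRUCT
    column with CAST(ROW(...) AS ROW(name type, ...)).
    `fields` maps the struct's field names to their Athena types."""
    row_args = ", ".join(f"{struct_col}.{name}" for name in fields)
    row_type = ", ".join(f"{name} {typ}" for name, typ in fields.items())
    return (
        f"INSERT INTO {target} "
        f"SELECT CAST(ROW({row_args}) AS ROW({row_type})) AS {struct_col} "
        f"FROM {source};"
    )
```

The field list and types would have to match the crawler-generated column definition exactly, since ROW casts are positional.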
I've been setting up an AWS Glue test environment in which business users should use tables available in the Glue catalog as data sources for their Glue jobs. The tables in the catalog come from data marts...
2 answers · 0 votes · 1168 views · asked a year ago
Hello,
I have a Glue PySpark job that writes to Amazon S3 in CSV format, but I don't see any file with a CSV extension on Amazon S3. How can I see it?
I am using `write_dynamic_frame.from_options`...
3 answers · 0 votes · 272 views · asked a year ago
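Glue's CSV writer emits Spark-style part files (e.g. `run-...-part-r-00000`) without a `.csv` suffix; the contents are still CSV. A hedged sketch of one post-job workaround — derive a `.csv` key and copy each object with boto3 (bucket/prefix names illustrative; the copy step needs credentials):

```python
def with_csv_suffix(key: str) -> str:
    """Return the S3 key with a .csv suffix appended if it lacks one."""
    return key if key.endswith(".csv") else key + ".csv"

def rename_part_files(bucket: str, prefix: str) -> None:
    """Copy each part file under `prefix` to a .csv key and delete the
    original. Requires boto3 and AWS credentials."""
    import boto3  # imported here so the pure helper above stays AWS-free
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    for obj in resp.get("Contents", []):
        key = obj["Key"]
        new_key = with_csv_suffix(key)
        if new_key != key:
            s3.copy_object(Bucket=bucket, Key=new_key,
                           CopySource={"Bucket": bucket, "Key": key})
            s3.delete_object(Bucket=bucket, Key=key)
```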
Hi!
I'm having some issues processing a DynamoDB export in Glue. It looks like my schema is breaking DynamicFrame's unnestDDBJson transformation.
This is the minimal test case that will trigger the...
1 answer · 0 votes · 436 views · asked a year ago
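When `unnestDDBJson` chokes on a schema, one workaround is to unwrap DynamoDB's typed attribute format (`{"S": ...}`, `{"N": ...}`, `{"M": ...}`, ...) by hand — boto3 ships the same logic as `boto3.dynamodb.types.TypeDeserializer`. A stdlib-only sketch covering the common type tags:

```python
def unwrap(av: dict):
    """Convert a DynamoDB typed attribute value into a plain Python value.
    Handles S, N, BOOL, NULL, L, and M tags; others raise ValueError."""
    (tag, val), = av.items()  # each typed value is a single-key dict
    if tag == "S":
        return val
    if tag == "N":
        return float(val) if "." in val else int(val)
    if tag == "BOOL":
        return val
    if tag == "NULL":
        return None
    if tag == "L":
        return [unwrap(v) for v in val]
    if tag == "M":
        return {k: unwrap(v) for k, v in val.items()}
    raise ValueError(f"unsupported DynamoDB type tag: {tag}")
```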
Using the "Derived Column" step, where I can use SQL functions to derive new columns, I found that SUBSTRING works but CHARINDEX doesn't. And, as usual with AWS documentation, there's no place or...
1 answer · 0 votes · 357 views · asked a year ago
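The likely explanation is that the Derived Column step evaluates Spark SQL expressions, and Spark SQL has no CHARINDEX; its equivalents are `instr(str, substr)` and `locate(substr, str)`, both 1-based and returning 0 when the substring is absent — the same convention CHARINDEX uses. A tiny Python model of that semantics, handy for sanity-checking expressions offline:

```python
def instr(s: str, sub: str) -> int:
    """Mimic Spark SQL instr(): 1-based position of sub in s, 0 if absent.
    str.find returns a 0-based index or -1, so adding 1 matches both cases."""
    return s.find(sub) + 1
```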