Questions tagged with Extract Transform & Load Data
Hi all,
I'm relatively new to Glue, but I've built a Python ETL script that works pretty well. It reads two CSV files into dataframes and then unions them together into one normalized...
1 answer · 0 votes · 68 views · asked 3 days ago
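A minimal pure-Python sketch of the union step described in the question above, with hypothetical file contents and column names standing in for the two CSV files (in a Glue job this would typically be DynamicFrame or Spark DataFrame operations):

```python
import csv
import io

# Two in-memory CSV sources standing in for the job's input files
# (the columns here are hypothetical).
csv_a = "id,name\n1,alice\n2,bob\n"
csv_b = "id,name\n3,carol\n"

def read_rows(text):
    """Parse CSV text into a list of dicts, one per row."""
    return list(csv.DictReader(io.StringIO(text)))

# Union: both files share the same header, so the row sets concatenate directly.
merged = read_rows(csv_a) + read_rows(csv_b)
print(len(merged))  # 3 rows after the union
```

The same shape carries over to Glue, where the union would be a `DataFrame.union` after reading both files with matching schemas.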
Hi Team,
I am trying to run SQL statements in Redshift, triggered by EventBridge on S3 file arrival. I am able to run SQL using the Data API, but I want to pass the event details to...
1 answer · 0 votes · 565 views · asked 10 days ago
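One common pattern for the question above is a Lambda handler that pulls the bucket and key out of the EventBridge "Object Created" event and passes them as named parameters to the Data API's `ExecuteStatement`. A sketch with hypothetical cluster, secret, and SQL values, and with the client injected so the call shape can be exercised without AWS:

```python
def handler(event, redshift_data):
    """Run a parameterized SQL statement via the Redshift Data API,
    using details from an EventBridge S3 "Object Created" event."""
    detail = event["detail"]
    bucket = detail["bucket"]["name"]
    key = detail["object"]["key"]
    # The Data API supports named parameters referenced as :name in the SQL.
    return redshift_data.execute_statement(
        ClusterIdentifier="my-cluster",                 # hypothetical
        Database="dev",                                 # hypothetical
        SecretArn="arn:aws:secretsmanager:...",         # hypothetical
        Sql="INSERT INTO arrived_files (path) VALUES (:s3path);",  # hypothetical SQL
        Parameters=[{"name": "s3path", "value": f"s3://{bucket}/{key}"}],
    )

class _FakeClient:
    """Stand-in for boto3.client('redshift-data') so the sketch runs locally."""
    def execute_statement(self, **kwargs):
        return kwargs

event = {"detail": {"bucket": {"name": "my-bucket"}, "object": {"key": "in/file.csv"}}}
sent = handler(event, _FakeClient())
print(sent["Parameters"])
```

In a real deployment the second argument would be `boto3.client("redshift-data")` and the handler would be the Lambda entry point behind the EventBridge rule.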
I'd be grateful for a clue on how to craft a connection string for AWS Glue to connect to a SQL Server Always On AG using the Microsoft JDBC drivers. I'm trying to use the bring-your-own-driver...
2 answers · 0 votes · 209 views · asked 16 days ago
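For the connection-string question above: the Microsoft JDBC driver connects to an Availability Group through its listener, and `multiSubnetFailover=true` is the driver property Microsoft documents for fast failover across subnets. A small sketch that assembles such a URL (listener host, port, and database name are hypothetical):

```python
def sqlserver_ag_url(listener_host, database, port=1433, read_only=False):
    """Build a Microsoft JDBC URL for a SQL Server Always On AG listener."""
    props = [
        f"databaseName={database}",
        "multiSubnetFailover=true",  # documented for Always On AG listeners
    ]
    if read_only:
        # Route to a readable secondary when read-only routing is configured.
        props.append("applicationIntent=ReadOnly")
    return f"jdbc:sqlserver://{listener_host}:{port};" + ";".join(props)

url = sqlserver_ag_url("ag-listener.example.com", "sales")
print(url)
```

The resulting string is what would go into the Glue connection's JDBC URL field alongside the uploaded driver JAR.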
Currently, I am using AWS Glue to extract data from MongoDB and push all the data to a JSON file in S3. I use the **create_dynamic_frame** function to extract data from MongoDB and use...
2 answers · 0 votes · 118 views · asked 21 days ago
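For the MongoDB extraction above, Glue's `create_dynamic_frame_from_options` with `connection_type="mongodb"` takes a `connection_options` dict. A sketch of assembling those options (the URI, database, collection, and credentials are hypothetical; the actual Glue call only runs inside a Glue job, so it is shown as a comment):

```python
def mongo_connection_options(uri, database, collection, username, password):
    """Assemble the connection_options dict Glue expects for connection_type='mongodb'."""
    return {
        "uri": uri,            # e.g. "mongodb://host:27017"
        "database": database,
        "collection": collection,
        "username": username,
        "password": password,
    }

opts = mongo_connection_options(
    "mongodb://example-host:27017", "appdb", "orders", "reader", "secret"  # hypothetical
)
# Inside a Glue job this would be used roughly as:
# dyf = glue_context.create_dynamic_frame_from_options(
#     connection_type="mongodb", connection_options=opts)
print(sorted(opts))
```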
I'm creating a role in AWS Glue to read CSV files from an S3 bucket. I'm granting full access to S3, but I can't seem to avoid this error. I contacted support, and they suggested increasing the usage...
0 answers · 0 votes · 66 views · asked 24 days ago
Hello, so the context is:
We have a DMS service that sends data from Oracle to S3; after that, a job runs to create the raw stage, and then raw to trusted. The problem is that they changed...
0 answers · 0 votes · 63 views · asked 25 days ago
I am trying to create two DynamicFrames based on a column that is a boolean. I have tried
`dyf.split_rows({'mybool': {'=': 'true'}}, 'is_true', 'is_not_true')`
`dyf.split_rows({'mybool': {'=':...
2 answers · 0 votes · 88 views · asked 25 days ago
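The split described in the question above amounts to two complementary filters on the boolean column; on a DynamicFrame that would be two `filter()` calls, and the underlying logic is easy to see in plain Python (the records and column name mirror the question):

```python
def split_by_bool(records, column):
    """Partition records into (true_rows, false_rows) on a boolean column."""
    true_rows = [r for r in records if r[column]]
    false_rows = [r for r in records if not r[column]]
    return true_rows, false_rows

rows = [{"id": 1, "mybool": True}, {"id": 2, "mybool": False}, {"id": 3, "mybool": True}]
is_true, is_not_true = split_by_bool(rows, "mybool")
print(len(is_true), len(is_not_true))  # 2 1

# On a DynamicFrame, the equivalent sketch would be:
# is_true = dyf.filter(lambda r: r["mybool"])
# is_not_true = dyf.filter(lambda r: not r["mybool"])
```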
AWS Glue Job Error
I'm trying to convert CSV files in S3 to Parquet in another S3 bucket. First I read the CSV files using a crawler, load the data into a table, and then use a job to convert from the table to S3 in...
0 answers · 0 votes · 322 views · asked a month ago
Is there a way to use the AWS SDK or a Python script to get all the job names that failed on a specific job queue in AWS Batch? After extracting them, I would like to upload this to S3 or RDS so that...
1 answer · 0 votes · 379 views · asked a month ago
I'm having the same issue. Data is stored in S3 in the format below, as a JSON array with partitions.
S3 path - s3://fleet-fuelcard-data-import-dev/lambda/fuelsoft-morgan/660306/2024/Apr/03-Apr-2024.json....
1 answer · 0 votes · 315 views · asked a month ago
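The example key in the question above encodes the date both in the year/month-name partitions and in a DD-Mon-YYYY file name. A stdlib sketch that parses that file-name pattern (the helper name is hypothetical):

```python
from datetime import datetime

def key_date(s3_key):
    """Extract the date from keys ending in DD-Mon-YYYY.json, e.g. 03-Apr-2024.json."""
    filename = s3_key.rsplit("/", 1)[-1]   # "03-Apr-2024.json"
    stem = filename.rsplit(".", 1)[0]      # "03-Apr-2024"
    return datetime.strptime(stem, "%d-%b-%Y").date()

print(key_date("lambda/fuelsoft-morgan/660306/2024/Apr/03-Apr-2024.json"))  # 2024-04-03
```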
In AWS Glue jobs, within the Targets node, I am unable to see data types such as struct, array, or map when changing the schema. Does AWS Glue not support these data types?
1 answer · 0 votes · 188 views · asked a month ago
I've successfully set up AWS Glue with an RDS database serving as the data source and a Snowflake database as the data target. In this setup, I've configured AWS Glue crawlers to catalog the metadata...
0 answers · 0 votes · 448 views · asked a month ago