Questions tagged with Extract Transform & Load Data
I'm trying to customize a script built via **Visual ETL** and came across an error.
Steps I followed:
* A DynamicFrame created from a CSV file in S3 needs to be evaluated for its data type
* I'm...
1 answer · 0 votes · 283 views · asked 5 months ago
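A minimal sketch of one way to check a DynamicFrame's inferred types: in a real Glue job you would inspect `dyf.schema()` / `dyf.printSchema()`; here the schema is represented as a plain dict (an assumption), and the helper flags ambiguous `choice` types that typically need `resolveChoice` when a CSV is read from S3.

```python
# Sketch: deciding which columns of a Glue DynamicFrame need resolveChoice.
# The dict below stands in for what dyf.printSchema() would report; the
# column names and types are illustrative assumptions.

def columns_needing_resolution(inferred_types):
    """Return columns whose inferred type is an ambiguous 'choice' type."""
    return [col for col, typ in inferred_types.items() if typ.startswith("choice")]

# Example: a CSV crawled from S3 where 'price' was read as both int and string.
schema = {
    "id": "long",
    "name": "string",
    "price": "choice<int,string>",   # ambiguous -> needs resolveChoice
}
ambiguous = columns_needing_resolution(schema)
# In the job script:
# dyf = dyf.resolveChoice(specs=[(col, "cast:int") for col in ambiguous])
```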
Hi all, I had a quick question regarding AWS Glue Catalog tables.
I currently have a crawler crawling one folder with 3 files of the same schema. If I want to reduce to just 1 file, would I...
1 answer · 0 votes · 370 views · asked 5 months ago
Hi all, is there any direct way to set up CloudWatch alarms to alert you when a Glue job fails, without using a Lambda function? For example, using a direct metric such as...
2 answers · 0 votes · 740 views · asked 5 months ago
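A sketch of the Lambda-free approach: alarm directly on the Glue job metric `glue.driver.aggregate.numFailedTasks`. The function only builds the kwargs for boto3's `cloudwatch.put_metric_alarm()`; the job name and SNS topic ARN are placeholders (assumptions).

```python
# Sketch: CloudWatch alarm on a Glue job metric, so a failed run alerts via
# SNS without any Lambda in between. Values below are illustrative.

def failed_job_alarm(job_name, sns_topic_arn):
    """Build the kwargs for boto3 cloudwatch.put_metric_alarm()."""
    return {
        "AlarmName": f"{job_name}-failed",
        "Namespace": "Glue",
        "MetricName": "glue.driver.aggregate.numFailedTasks",
        "Dimensions": [
            {"Name": "JobName", "Value": job_name},
            {"Name": "JobRunId", "Value": "ALL"},
            {"Name": "Type", "Value": "count"},
        ],
        "Statistic": "Sum",
        "Period": 300,
        "EvaluationPeriods": 1,
        "Threshold": 0,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [sns_topic_arn],
    }

alarm = failed_job_alarm("my-etl-job", "arn:aws:sns:us-east-1:123456789012:etl-alerts")
# boto3.client("cloudwatch").put_metric_alarm(**alarm)  # the actual AWS call
```

An EventBridge rule on the Glue Job State Change event (`"state": "FAILED"`) is the other common Lambda-free route.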
While mapping a few columns from a CSV file in S3 to a Redshift table, I can't change the datatype from long to int. The list is empty, and the search is not showing any results either. In the catalog...
2 answers · 0 votes · 277 views · asked 5 months ago
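When the visual editor's type dropdown comes up empty, the cast can still be expressed in the generated script via `ApplyMapping`. A sketch of the 4-tuple mappings it expects; the column names are illustrative assumptions.

```python
# Sketch: ApplyMapping 4-tuples (source name, source type, target name,
# target type) for casting a crawled long column to int on the way to Redshift.

mappings = [
    ("order_id", "long", "order_id", "int"),
    ("customer", "string", "customer", "string"),
]
# In the Glue job script:
# mapped = ApplyMapping.apply(frame=dyf, mappings=mappings)
target_types = {dst: typ for _, _, dst, typ in mappings}
```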
Hey all, I currently have a data file in S3 which I am trying to output to Redshift through AWS Glue Studio. My data was successfully sent to the Redshift tables, but the amount field has changed the...
2 answers · 0 votes · 236 views · asked 5 months ago
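Amount fields usually change when the target column's numeric type is left to inference. A sketch of sizing an explicit Redshift `DECIMAL(precision, scale)` from sample values so the target column can be declared up front; the sample amounts are made up for illustration.

```python
from decimal import Decimal

# Sketch: choose a DECIMAL(precision, scale) wide enough for all sample
# amounts, instead of letting the load coerce the column.

def decimal_spec(samples):
    tuples = [Decimal(s).as_tuple() for s in samples]
    scale = max(-t.exponent for t in tuples)                       # most fractional digits seen
    int_digits = max(len(t.digits) + t.exponent for t in tuples)   # digits left of the point
    return f"DECIMAL({max(int_digits, 1) + scale},{scale})"

spec = decimal_spec(["19.99", "1250.5", "3.125"])
```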
Hi, I'm relatively new to AWS Glue and I'm having trouble with a Glue ETL error.
What makes it strange is that this error occurs only in the Dev environment, not in Test. Same code & configuration!
Also tried to...
1 answer · 0 votes · 269 views · asked 5 months ago
Hey all, I have been trying to perform a simple S3-to-Redshift data push using an S3 source node and an Amazon Redshift target node.
I have been getting errors such as
'Failed to connect to IP...
4 answers · 0 votes · 342 views · asked 5 months ago
I have a MySQL database in AWS RDS. I want to bulk-import data into a table in the database, which contains thousands of rows.
I need to do it from my website, which is also deployed in AWS. The web app is developed...
1 answer · 0 votes · 233 views · asked 5 months ago
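For thousands of rows from a web backend, batched multi-row inserts are the usual pattern. A sketch of the batching side, assuming a driver like pymysql runs `cursor.executemany(sql, batch)` per chunk; the table and column names are made-up examples.

```python
# Sketch: chunk thousands of rows into batches for executemany() against a
# MySQL table on RDS. Table/columns are illustrative assumptions.

def batches(rows, size):
    """Yield successive chunks of `rows`, each at most `size` long."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

INSERT_SQL = "INSERT INTO products (sku, name, price) VALUES (%s, %s, %s)"

rows = [(f"sku-{i}", f"item {i}", i * 1.5) for i in range(2500)]
chunks = list(batches(rows, 1000))
# With pymysql or similar, per chunk:
# cursor.executemany(INSERT_SQL, batch); connection.commit()
```

For very large files, `LOAD DATA LOCAL INFILE` (if enabled on the RDS parameter group) is faster still.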
I am building a data pipeline to load data into Redshift from an S3 data lake.
The data is stored in Parquet format on S3, and I would like to load it into the respective Redshift tables using an AWS...
1 answer · 0 votes · 411 views · asked 6 months ago
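A sketch of the Redshift `COPY` statement a Glue job (or any SQL client) can issue to load a Parquet prefix straight from S3. Table name, S3 prefix, and IAM role ARN are placeholders (assumptions).

```python
# Sketch: build the COPY statement for loading Parquet from S3 into Redshift.
# Only the string is built here; the job would execute it over a Redshift
# connection.

def copy_parquet_sql(table, s3_prefix, iam_role):
    return (
        f"COPY {table} "
        f"FROM '{s3_prefix}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS PARQUET"
    )

sql = copy_parquet_sql(
    "analytics.events",
    "s3://my-datalake/events/",
    "arn:aws:iam::123456789012:role/redshift-copy",
)
```

With Parquet, COPY matches columns by name, so the table's column names should match the files' schema.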
I need to create an AES-ECB encryption UDF in Redshift. To achieve this, I have imported a pycryptodome zip file into S3 under the name Crypto.zip and created a library in Redshift.
When I try to...
1 answer · 0 votes · 257 views · asked 6 months ago
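A sketch of what the DDL for such a UDF can look like, assuming the uploaded library is importable as `Crypto` (matching the Crypto.zip above); the function name, padding, and key handling are assumptions, and this only builds the DDL string. Note that ECB mode leaks plaintext patterns; it appears here only because the question asks for it.

```python
# Sketch: DDL for a Redshift Python UDF that uses the uploaded pycryptodome
# library for AES-ECB. Everything except the 'Crypto' library name is an
# illustrative assumption.

UDF_SQL = """
CREATE OR REPLACE FUNCTION aes_ecb_encrypt(plaintext varchar, key varchar)
RETURNS varchar
STABLE
AS $$
    import base64
    from Crypto.Cipher import AES
    from Crypto.Util.Padding import pad
    cipher = AES.new(key.encode()[:16].ljust(16, b'0'), AES.MODE_ECB)
    return base64.b64encode(cipher.encrypt(pad(plaintext.encode(), 16))).decode()
$$ LANGUAGE plpythonu;
""".strip()
```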
Scenario:
Source table: Glue Data Catalog table **study** crawled from MySQL with columns:
* id (int)
* code (varchar)
* desc (varchar)
* 2 other columns not used in the job
Target table:...
0 answers · 0 votes · 99 views · asked 6 months ago
I want to add Confluent Cloud Apache Kafka as a data source in an AWS Glue ETL job to read a data stream from a Kafka topic.
I created a cluster, topic, AWS SQS source connector, and AWS S3 sink connector in...
1 answer · 0 votes · 331 views · asked 6 months ago
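A sketch of the connection options a Glue streaming job can pass to read from a Confluent Cloud topic over SASL_SSL; the bootstrap servers, topic, and credentials below are placeholders (assumptions), and the exact option keys should be checked against the Glue Kafka connection documentation.

```python
# Sketch: connection options for a Glue streaming read from Confluent Cloud
# Kafka. All values are illustrative placeholders.

def kafka_source_options(bootstrap_servers, topic, api_key, api_secret):
    return {
        "bootstrap.servers": bootstrap_servers,
        "subscribe": topic,
        "startingOffsets": "earliest",
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.jaas.config": (
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            f'username="{api_key}" password="{api_secret}";'
        ),
    }

opts = kafka_source_options(
    "my-cluster.us-east-1.aws.confluent.cloud:9092", "orders",
    "API_KEY", "API_SECRET",
)
# In the job:
# df = glueContext.create_data_frame.from_options(
#     connection_type="kafka", connection_options=opts)
```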