Schema incorrectly showing Data type of array in Glue Catalog when using Delta Lake table


I have a Delta Lake table saved in s3. I am running the following command:

spark.sql(""" CREATE EXTERNAL TABLE db.my_table USING DELTA LOCATION 's3://path/to/delta/table """)

Everything seems to work fine, except that when I look at the schema in the Glue Catalog it shows a single field with column name "col" and data type "array". It should have two fields, first_name and last_name, both strings.

The schema populates correctly when I use a Glue crawler, but I have been asked to provide an alternative solution. How can this be done?

temp999
asked 4 months ago · 279 views
2 Answers
Accepted Answer

When creating the table using Spark SQL, the Glue catalog table may not correctly reflect the table schema; however, SQL queries against the table should still work, because the schema is read from the Delta Lake metadata stored in the table's S3 location.
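For example (a minimal sketch reusing the table and column names from the question), a query like the following should still return the real columns even while the Glue catalog shows the placeholder col array field:

spark.sql("SELECT first_name, last_name FROM db.my_table").show()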

If you would like the table schema to be populated in the Glue catalog table, you may consider creating the Delta Lake table using an Athena query. Athena infers the Delta Lake table metadata from the Delta Lake transaction log and synchronizes it with the Glue catalog. Please see the following document on how to create Delta Lake tables using Athena: https://docs.aws.amazon.com/athena/latest/ug/delta-lake-tables.html#delta-lake-tables-getting-started
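As a rough sketch, reusing the database, table name, and S3 path from the question, the Athena DDL would look something like this (note that no column list is specified, since Athena reads the schema from the Delta Lake transaction log):

CREATE EXTERNAL TABLE db.my_table
LOCATION 's3://path/to/delta/table/'
TBLPROPERTIES ('table_type' = 'DELTA');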

Please note that there are no charges for DDL queries in Athena.

AWS
SUPPORT ENGINEER
answered 4 months ago

It is a known limitation of the library: https://github.com/delta-io/delta/issues/1679
As Davlish points out, there are alternatives, so it shouldn't be a blocker.

AWS
EXPERT
answered 4 months ago
