Schema incorrectly showing data type of array in Glue Catalog when using a Delta Lake table


I have a Delta Lake table saved in S3. I am running the following command:

spark.sql(""" CREATE EXTERNAL TABLE db.my_table USING DELTA LOCATION 's3://path/to/delta/table """)

Everything seems to work fine, except when I look at the schema in the Glue Catalog it shows a single field with the column name "col" and data type "array". It should have two fields, first_name and last_name, both strings.

It populates correctly using a crawler but I have been asked to provide an alternative solution. How can this be done?

temp999
asked 4 months ago · 339 views
2 Answers
Accepted Answer

When you create the table using Spark SQL, the Glue table may not correctly reflect the table schema; however, SQL queries against the table should still work, because the schema is read from the Delta metadata (the transaction log) stored in the table's S3 location.
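For example (a minimal sketch using the table and column names from your question), querying through Spark should still return the real schema even while the Glue console shows the single "col" array column:

spark.table("db.my_table").printSchema()  # should list first_name and last_name as string columns
spark.sql("SELECT first_name, last_name FROM db.my_table LIMIT 10").show()  # queries resolve the real columns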

If you would like the table schema to be populated in the Glue catalog table, you may consider creating the Delta Lake table using an Athena query. Athena infers the Delta Lake table metadata from the Delta Lake transaction log and synchronizes it with the Glue catalog. Please see the following document on how to create Delta Lake tables using Athena: https://docs.aws.amazon.com/athena/latest/ug/delta-lake-tables.html#delta-lake-tables-getting-started
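As a sketch (reusing the S3 path from your question; adjust the database, table name, and path as needed), the Athena DDL looks like this, with no column list because Athena infers the columns from the Delta transaction log and writes them to the Glue catalog:

CREATE EXTERNAL TABLE db.my_table
LOCATION 's3://path/to/delta/table'
TBLPROPERTIES ('table_type' = 'DELTA')

If the Spark-created table already exists in the Glue catalog under the same name, drop it first or register the Athena table under a different name to avoid a conflict.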

Please note that there are no charges for DDL queries in Athena.

AWS
SUPPORT ENGINEER
answered 4 months ago

It is a known limitation of the library: https://github.com/delta-io/delta/issues/1679
As Davlish points out, there are alternatives, so it shouldn't be a blocker.

AWS
EXPERT
answered 4 months ago
