ApplyMapping doesn't bubble up errors from your code, which is probably why you end up with empty columns.
Try debugging the function with plain Python, or catch any exception inside the function and write the message into a string column so you can see it (see the sketch below).
I think the DataFrame alternative you mention is easier and more robust.
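For example, a minimal sketch of the "surface the error in a column" idea, assuming the conversion runs inside a Map.apply transform; `source_dyf`, the `order_date` column and the `%d/%m/%Y` input format are placeholders for your own job:

```python
from datetime import datetime
from awsglue.transforms import Map

def parse_record(rec):
    rec["parse_error"] = ""
    try:
        # Reformat the custom date to ISO so a later ApplyMapping cast succeeds
        rec["order_date"] = datetime.strptime(rec["order_date"], "%d/%m/%Y").strftime("%Y-%m-%d")
    except Exception as exc:
        # Keep the record and expose the failure instead of silently getting nulls
        rec["parse_error"] = str(exc)
    return rec

mapped_dyf = Map.apply(frame=source_dyf, f=parse_record)
```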
ApplyMapping casting works for dates in one of the ISO variants, e.g. 2023-01-08. For custom formats you can convert to a DataFrame and specify the format yourself, as you are already aware.
Just posting for your reference: https://sparkbyexamples.com/spark/spark-date-functions-how-to-parse-and-format-date/#Parsing-Date-from-String-object-to-Spark-DateType
As suggested by Gonzalo Herreros, converting to a DataFrame and applying the transformation there involves less hassle and is more robust.
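Something along these lines, a minimal sketch assuming a hypothetical `order_date` column arriving as `dd/MM/yyyy` and the usual `glueContext` available in the job (column name and format are illustrative):

```python
from awsglue.dynamicframe import DynamicFrame
from pyspark.sql.functions import to_date, col

df = source_dyf.toDF()

# to_date with an explicit pattern handles formats ApplyMapping cannot cast
df = df.withColumn("order_date", to_date(col("order_date"), "dd/MM/yyyy"))

# Convert back to a DynamicFrame if the rest of the job works with one
result_dyf = DynamicFrame.fromDF(df, glueContext, "result_dyf")
```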
Thank you for clarifying