2 Answers
Try importing the specific function instead of `import *`. For example, use `from pyspark.sql.functions import split` to import the `split` function.
I tried replicating your problem, and Python complained that `import *` can only be used at module level. When I changed it to import the specific function, it worked.
Hope this helps.
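The error described above is a Python language rule rather than anything Glue-specific: wildcard imports are only legal at module level, and a Glue Studio custom transform puts your code inside a function. A quick demonstration of the difference (no Spark installation needed, since the failure happens at compile time; the `MyTransform` name is just illustrative):

```python
# A wildcard import inside a function body is a SyntaxError in Python 3;
# importing a specific name is fine. We can show this with compile(),
# which checks syntax without executing anything.
bad_snippet = """
def MyTransform(glueContext, dfc):
    from pyspark.sql.functions import *
"""

good_snippet = """
def MyTransform(glueContext, dfc):
    from pyspark.sql.functions import split
"""

def compiles(src):
    """Return True if the source is syntactically valid Python."""
    try:
        compile(src, "<transform>", "exec")
        return True
    except SyntaxError:
        return False

print(compiles(bad_snippet))   # False: "import * only allowed at module level"
print(compiles(good_snippet))  # True
```

This is why switching from `import *` to a named import fixes the custom transform even though the same wildcard import works in a standalone script.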
answered 2 years ago
Hi,
Yes, any library you need for your custom transform should be imported within the function, since only the function you write in the editor is injected into the generated job script.
Also note that if you just want to run Spark SQL, you could use the SQL transform instead.
Hope this helps,
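The pattern described here can be sketched as follows. This is a minimal illustration, not a definitive template: the function, column, and frame names are all hypothetical, and it assumes the usual Glue Studio custom transform shape where the function receives a `DynamicFrameCollection` and returns one.

```python
# Hedged sketch of a Glue Studio custom transform. Imports live inside
# the function body so they travel with the snippet and are resolved
# only when Glue actually calls the function.
def MyTransform(glueContext, dfc):
    # Import dependencies inside the function, not at module level.
    from awsglue.dynamicframe import DynamicFrame, DynamicFrameCollection
    from pyspark.sql.functions import split

    # Take the first frame in the incoming collection (illustrative).
    df = dfc.select(list(dfc.keys())[0]).toDF()

    # Example use of the imported function: split a delimited column.
    # "csv_field" is a hypothetical column name.
    df = df.withColumn("parts", split(df["csv_field"], ","))

    # Wrap the result back into a DynamicFrameCollection for Glue.
    out = DynamicFrame.fromDF(df, glueContext, "out")
    return DynamicFrameCollection({"out": out}, glueContext)
```

Because the imports are deferred into the function body, the snippet also loads cleanly outside a Glue environment; the dependencies are only needed when the job runs.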
So is it a best practice to run the imports within the function of the custom transform?