One possible cause of this issue is that the PySpark kernel does not have access to the required Python libraries. To install additional libraries for the PySpark kernel, you need to make sure they are available on the EMR cluster. Here are the steps:

1. Install the libraries on the EMR cluster using pip. For example:

   !pip install pandas==0.25.1

2. Make sure the libraries are available on all worker nodes in the EMR cluster. You can do this by setting PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON in the spark-env.sh file on the worker nodes to point at the Python interpreter that has the libraries installed.

3. Restart the PySpark kernel in JupyterHub so the new libraries are picked up.
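The spark-env.sh part of these steps can be sketched as a short shell snippet. This is a sketch under assumptions: the interpreter path /usr/bin/python3 and the local file name spark-env-additions.sh are placeholders; on an EMR node the lines would typically be appended to /etc/spark/conf/spark-env.sh instead, and the pip install would run on every node (for example via a bootstrap action).

```shell
# Install the library for the interpreter Spark will use (run on every node,
# e.g. via an EMR bootstrap action). Commented out here because it needs a cluster:
# sudo /usr/bin/python3 -m pip install pandas==0.25.1

# Write the spark-env.sh additions to a local file for illustration; on a real
# worker node, append these lines to /etc/spark/conf/spark-env.sh instead.
cat > spark-env-additions.sh <<'EOF'
export PYSPARK_PYTHON=/usr/bin/python3
export PYSPARK_DRIVER_PYTHON=/usr/bin/python3
EOF

# Confirm both variables were written.
grep -c '^export PYSPARK' spark-env-additions.sh
```

Both variables should point at the same interpreter, otherwise the driver and the executors may run different Python versions and fail with a version-mismatch error.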
For a complete guide on how to install additional kernels and libraries on EMR JupyterHub, please read the documentation page here