Importing "openai-whisper" in Python Lambda Function OSError


Hi all, I want to import the openai-whisper module (https://github.com/openai/whisper) into a Python Lambda function. The package is large (about 4 GB with its dependencies), so I had to attach an EFS file system to the Lambda function. Everything worked until I tested the function, at which point importing the whisper module fails with this error:

[ERROR] OSError: /mnt/ddv/ddv/nvidia/cufft/lib/libcufft.so.10: failed to map segment from shared object
Traceback (most recent call last):
  File "/var/lang/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/var/task/lambda_function.py", line 9, in <module>
    import whisper
  File "/mnt/ddv/ddv/whisper/__init__.py", line 8, in <module>
    import torch
  File "/mnt/ddv/ddv/torch/__init__.py", line 228, in <module>
    _load_global_deps()
  File "/mnt/ddv/ddv/torch/__init__.py", line 189, in _load_global_deps
    _preload_cuda_deps(lib_folder, lib_name)
  File "/mnt/ddv/ddv/torch/__init__.py", line 155, in _preload_cuda_deps
    ctypes.CDLL(lib_path)
  File "/var/lang/lib/python3.9/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)

Does anyone know how to resolve this error? Thanks in advance.

asked a year ago · 919 views
2 Answers

Does this module work without a GPU in the execution environment?
If a GPU is required, it would not be available in Lambda.

EXPERT
answered a year ago
  • Hi Riku, how can I check whether this module works without a GPU in the Lambda execution environment?

  • A good place to start would be to create a sample Python script and see whether it works on an EC2 instance without a GPU.

  • Did you ever get this working? We are attempting something similar and would be happy to work with you to debug this and see if we can get it running as a Lambda function. I've gotten Whisper working inside an Anaconda notebook on an M2 MacBook Air, but that has a built-in GPU, and I'm not entirely sure what hardware the code executes on. Would love to get this deployed to Lambda.
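A quick way to answer the "does this need a GPU?" question above is to ask PyTorch itself whether it can see a CUDA device. This is a minimal sketch, not a definitive test; it returns False both when no GPU is visible (as expected in Lambda or on a GPU-less EC2 instance) and when torch is not installed at all:

```python
def has_cuda():
    """Report whether PyTorch can see a CUDA GPU in this environment.

    Returns False when torch is not installed or no GPU is visible,
    which is the expected result inside a Lambda execution environment.
    """
    try:
        import torch  # heavy import; only needed for this check
    except ImportError:
        return False
    return torch.cuda.is_available()

print("CUDA available:", has_cuda())
```

If this prints False but Whisper still runs locally, the model is falling back to CPU, which suggests the Lambda import failure is about loading the CUDA shared libraries rather than an actual GPU requirement.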


Hi diegodavila,

It seems your code uses PyTorch built with GPU (CUDA) support; that's why you are getting a CUDA-related error. Please check whether you can run your code without GPU support.

Currently the Lambda runtime does not support GPUs. If you do require a GPU, you can consider container services such as Amazon Elastic Container Service (Amazon ECS) or Kubernetes, or deploy the model to an Amazon SageMaker endpoint.
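With the SageMaker route, the Lambda function stays lightweight and only forwards audio to a GPU-backed endpoint. A minimal sketch of the invocation side, assuming a hypothetical endpoint named by the caller is already deployed and that the endpoint accepts raw audio bytes (the content type and payload format depend entirely on your serving container):

```python
import importlib.util

def invoke_whisper_endpoint(endpoint_name, audio_bytes):
    """Call a SageMaker inference endpoint (hypothetical deployment).

    Returns None if boto3 is unavailable, so the sketch degrades
    gracefully outside an AWS environment.
    """
    if importlib.util.find_spec("boto3") is None:
        return None
    import boto3
    client = boto3.client("sagemaker-runtime")
    response = client.invoke_endpoint(
        EndpointName=endpoint_name,              # hypothetical endpoint name
        ContentType="application/octet-stream",  # must match your container
        Body=audio_bytes,
    )
    return response["Body"].read()
```

This keeps the multi-gigabyte model and its CUDA dependencies out of the Lambda package and EFS entirely.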

Bumuthu
answered a year ago
  • Hi Bumuthu, how can I check if I can run this Lambda Function without GPU support?
