Dear Customer,
Thank you for using AWS DeepLens.
Looking at the error stack, it appears to be a syntax error in the script mo.py around the call re.sub(b', question #\d+', '', std_err). Since the inference script used in your example is not available, it is difficult to narrow down the issue; additionally, we would not be able to provide code support if the issue is related to a custom script built by the user.
I'd recommend fixing the syntax error and trying again. If you still encounter the error, please reach out to AWS Support for further investigation, including all the details and logs, as sharing logs on this platform is not recommended.
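As a hedged sketch of what may be going wrong (the exact mo.py source is not shown here, so the failing call is reconstructed from the error message): in Python 3, re.sub requires the pattern, the replacement, and the input to all be bytes or all be str, and the snippet mixes a bytes pattern with a str replacement. A raw bytes literal also avoids the invalid-escape warning for \d.

```python
import re

# Hypothetical reconstruction of the failing call in mo.py.
std_err = b"optimizer output, question #42, more output"

# Failing form: bytes pattern with a str replacement raises TypeError.
try:
    re.sub(rb', question #\d+', '', std_err)
except TypeError as exc:
    print("mixing bytes and str fails:", exc)

# Working form: keep everything bytes (rb'...' is a raw bytes literal,
# so \d is passed through to the regex engine unmodified).
cleaned = re.sub(rb', question #\d+', b'', std_err)
print(cleaned)
```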
Open a support case with AWS using the link: https://console.aws.amazon.com/support/home?#/case/create
My understanding is that all MXNet models need to be compiled to XML format to run on the DeepLens device under the Intel OpenVINO runtime environment.
All I am doing is calling the optimizer to perform this conversion. This is not my script; you can open any Python terminal and type the following.
import mo
error, model_path = mo.optimize('model', 512, 512, 'MXNet')
If this is not the correct process please tell me what is.
How to create a lambda inference function
https://www.awsdeeplens.recipes/300_intermediate/330_guess_drawing/332_inference/
AWS DeepLens uses the Intel OpenVINO model optimizer to optimize the ML model to run on DeepLens hardware. The following code optimizes a model to run locally:
error, model_path = mo.optimize(model_name, INPUT_WIDTH, INPUT_HEIGHT)
I get the error after calling this function.
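One way to narrow down a failure like this is to check the error value that mo.optimize returns rather than discarding it. A minimal sketch, assuming the documented (model_name, width, height, platform) signature; the mo module only exists on the DeepLens device, so the import is guarded, and optimize_model is a hypothetical helper, not part of the DeepLens API:

```python
# Hypothetical wrapper around the DeepLens model optimizer.
try:
    import mo  # present only on the DeepLens device
except ImportError:
    mo = None


def optimize_model(model_name, width, height, platform="MXNet"):
    """Run the optimizer and raise if it reports an error."""
    if mo is None:
        raise RuntimeError("the mo module is only available on a DeepLens device")
    error, model_path = mo.optimize(model_name, width, height, platform)
    if error:
        # Surface the optimizer's error output instead of ignoring it.
        raise RuntimeError(f"model optimization failed: {error}")
    return model_path
```

Off the device this raises immediately, which at least separates environment problems from genuine optimizer failures.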