How to run a batch transform job with a pre-trained .keras model and custom inference code


I am new to AWS and I do not understand a lot of things here. I have a trained LSTM model in <model-name>.keras format.

I am looking to run this model as a batch transform job. I have a custom script that generates the predictions. I have tried several versions of the code available online, but nothing seems to work.

Can I get some help/advice on how to proceed from here? I need to use my custom prediction script when running this as a batch transform job.

Sorry if I don't make sense with my question. Please correct me or ask any questions if anything here makes no sense. Thanks in advance :)

1 Answer

Hi,

You can bring your pre-trained model and own inference script using Amazon SageMaker script mode. This essentially allows you to utilise SageMaker's prebuilt containers whilst being able to modify the inference code. You can use Batch Transform with this option, too.

Please check this link for more information: Use Your Own Inference Code with Batch Transform

AWS
answered 3 months ago
