Hello,
Is it possible to pass hyperparameters to a PyTorchModel? For instance, I would like my inference script to load data from the same S3 bucket I used for training. How would I provide the S3 URI to the inference script when creating the PyTorchModel, and how would it be accessed from within the inference script? Right now I am trying everything from within a SageMaker Notebook Instance. Thanks for any help!
In my notebook I have:
from sagemaker.pytorch.model import PyTorchModel

model_bucket = .......

pytorch_model = PyTorchModel(
    model_data=model_bucket,
    role=role,
    entry_point='inference.py',
    py_version="py39",
    framework_version="1.13",
)
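One idea I had (this is an assumption on my part, I haven't confirmed it is the intended mechanism): the base `Model` class seems to accept an `env` dict of environment variables that get set on the serving container, so maybe the S3 URI could be passed that way. The variable name `TRAIN_DATA_S3_URI` and the bucket URI below are just placeholders I made up:

```python
# Sketch (assumption): pass the training-data location to the endpoint as an
# environment variable via the `env` argument of PyTorchModel.
train_data_uri = "s3://example-bucket/train/"  # hypothetical URI

env_vars = {"TRAIN_DATA_S3_URI": train_data_uri}

# pytorch_model = PyTorchModel(
#     model_data=model_bucket,
#     role=role,
#     entry_point='inference.py',
#     py_version="py39",
#     framework_version="1.13",
#     env=env_vars,  # would this make it available inside inference.py?
# )
```

Would that be the right way to do it, or is there a dedicated hyperparameter mechanism for models like there is for Estimators?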
My inference.py holds:
def model_fn(model_dir):
    pass

def input_fn(request_body, request_content_type):
    pass

def predict_fn(input_data, model):
    pass

def output_fn(prediction, content_type):
    pass
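If an environment-variable approach works, I assume `model_fn` could then read the URI back with `os.environ`. Here is roughly what I have in mind (the variable name and URI are placeholders, and the `setdefault` line only simulates what the container would set):

```python
import os

# Simulate the variable the serving container would (hypothetically) export:
os.environ.setdefault("TRAIN_DATA_S3_URI", "s3://example-bucket/train/")

def model_fn(model_dir):
    # Read the training-data location injected at deploy time (assumption).
    train_data_uri = os.environ.get("TRAIN_DATA_S3_URI")
    # ...would then download what I need from train_data_uri with boto3...
    return train_data_uri

result = model_fn("/opt/ml/model")
```

Is reading `os.environ` inside these handler functions supported, or is there a cleaner way to get deploy-time configuration into `inference.py`?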