I figured it out. It turns out the examples didn't spell out that you need to convert the Python model back from the Java model, and you can't call transform() directly on the DataFrame. Complete code below.
from sagemaker_pyspark import SageMakerModel
from sagemaker_pyspark.transformation.serializers import ProtobufRequestRowSerializer
from sagemaker_pyspark.transformation.deserializers import KMeansProtobufResponseRowDeserializer

# Serialize the "features" column into the protobuf format the endpoint expects
rowSer = ProtobufRequestRowSerializer(featuresColumnName="features")

# Attach to the already-deployed SageMaker endpoint
smModel = SageMakerModel.fromEndpoint(
    endpointName="endpoint-9ad5fcee9c52-2017-12-08T13-36-26-267",
    requestRowSerializer=rowSer,
    responseRowDeserializer=KMeansProtobufResponseRowDeserializer(
        closest_cluster_column_name="cluster",
        distance_to_cluster_column_name="closest"))

# fromEndpoint returns a Java-backed object; convert it back to a Python model
ew_model = SageMakerModel._from_java(smModel)

# Run inference on the DataFrame (pred), passing the converted model explicitly
data = SageMakerModel.transform(ew_model, pred)
answered 6 years ago