Hello Anthony,
Looking at your Lambda code, the issue has two main parts:
- You're using the wrong API method (invoke_model instead of converse or converse_stream). For Prompt Management, you need to use the Converse API, as documented in the boto3 documentation.
- The promptVariables structure needs to include a nested "text" object for each variable.
Try this:
import json
import boto3

bedrock = boto3.client(service_name='bedrock-runtime', region_name='us-west-2')

def lambda_handler(event, context):
    try:
        # Pull the word to translate and its surrounding sentence from the event
        to_translate = event['selected_text']
        containing_sentence = event['containing_sentence']

        # Invoke the managed prompt with the Converse API; each prompt
        # variable must be wrapped in a nested {"text": ...} object
        response = bedrock.converse(
            modelId="<prompt_ARN>",
            promptVariables={
                "word": {
                    "text": to_translate
                },
                "sentence": {
                    "text": containing_sentence
                }
            }
        )
        response_text = response['output']['message']['content'][0]['text']
        return {
            'statusCode': 200,
            'body': json.dumps({
                'response': response_text
            })
        }
    except Exception as e:
        return {
            'statusCode': 500,
            'body': json.dumps({
                'error': str(e)
            })
        }
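For a quick sanity check you can call the handler directly with a sample event; the values here are hypothetical, but the keys must match the ones the handler reads:

if __name__ == "__main__":
    # Hypothetical test event mirroring the keys lambda_handler expects
    test_event = {
        "selected_text": "bonjour",
        "containing_sentence": "Bonjour, comment allez-vous ?"
    }
    print(lambda_handler(test_event, None))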
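If you later want the reply token by token, converse_stream accepts the same promptVariables shape. A minimal sketch of the streaming variant, assuming the same prompt ARN and variables as above:

response = bedrock.converse_stream(
    modelId="<prompt_ARN>",
    promptVariables={
        "word": {"text": to_translate},
        "sentence": {"text": containing_sentence}
    }
)

chunks = []
for event in response['stream']:
    # Text arrives incrementally in contentBlockDelta events
    if 'contentBlockDelta' in event:
        chunks.append(event['contentBlockDelta']['delta']['text'])
response_text = ''.join(chunks)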
To correctly use the prompt variables with a prompt created in Prompt Management, you should structure your request body as follows:
- Remove the "promptVariables" key from the top level of your request body.
- Include your variables directly in the top level of the request body.
Here's how your modified code should look:
request_body = {
    "word": to_translate,
    "sentence": containing_sentence
}
body = json.dumps(request_body)

response = bedrock.invoke_model(
    modelId="<your_prompt_ARN>",
    contentType="application/json",
    accept="application/json",
    body=body
)
This structure aligns with how Bedrock expects the variables for a managed prompt to be passed. Each variable you defined in your prompt ({{word}} and {{sentence}}) should be a top-level key in your request body JSON.
Also, ensure that your prompt ARN is correct and that you have the necessary permissions to invoke the model with this prompt.
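On the permissions point: the Converse call is authorized by the bedrock:InvokeModel action (streaming additionally needs bedrock:InvokeModelWithResponseStream), and the execution role needs access to both the prompt and the foundation model it references. A rough sketch of the policy statement, with placeholder ARNs you would replace with your own region, account, and IDs:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream"
            ],
            "Resource": [
                "arn:aws:bedrock:us-west-2:<account_id>:prompt/<prompt_ID>",
                "arn:aws:bedrock:us-west-2::foundation-model/<model_ID>"
            ]
        }
    ]
}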
If you're still encountering issues after making these changes, you may want to double-check the following:
- Verify that the variable names in your request exactly match those in your prompt template (see the sketch after this list).
- Ensure that you're providing values for all required variables defined in your prompt.
- Check that your AWS credentials and region settings are correct.
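For the first check, you can pull the prompt definition and list the variables it declares. A rough sketch, assuming the GetPrompt response shape in the current boto3 bedrock-agent client (the control-plane client, distinct from bedrock-runtime):

import boto3

# 'bedrock-agent' is the control-plane client for Prompt Management,
# separate from the 'bedrock-runtime' client used to invoke prompts
agent = boto3.client('bedrock-agent', region_name='us-west-2')
prompt = agent.get_prompt(promptIdentifier='<prompt_ID_or_ARN>')

for variant in prompt['variants']:
    # Each variant's text template declares the input variables it expects
    declared = variant['templateConfiguration']['text'].get('inputVariables', [])
    print(variant['name'], [v['name'] for v in declared])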
By making these adjustments, you should be able to successfully invoke your managed prompt using the InvokeModel API.
Sources
Prompt Management for Amazon Bedrock - AWS
Code examples for Amazon Bedrock Runtime using AWS SDKs - Amazon Bedrock
Forming the request body as suggested here results in the error "An error occurred (ValidationException) when calling the InvokeModel operation: Providing attributes other than 'promptVariables' during runtime is not yet supported".
Is there a C# equivalent of this code in the latest .NET SDK?
That fixed it, thank you! As an aside, there are a couple of places in the docs that should really be updated to cover this: the InvokeModel page in the API reference still states "If you use a prompt created through Prompt management, specify the ARN of the prompt version. For more information, see Test a prompt using Prompt management." Additionally, the Using the Converse API page in the docs shows the promptVariables property without the needed "text" keys.