1 Answer
Hello.
You can use the retrieve_and_generate() API to query a Bedrock knowledge base.
https://docs.aws.amazon.com/bedrock/latest/userguide/kb-test-how.html
The boto3 documentation is here:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/bedrock-agent-runtime/client/retrieve_and_generate.html
Here is a sample:
import boto3
import json

# Client for the Bedrock Agent runtime APIs (retrieve, retrieve_and_generate).
bedrock_agent_runtime_client = boto3.client('bedrock-agent-runtime')

def lambda_handler(event, context):
    user_prompt = event.get('user_prompt')
    knowledge_base_id = '<Knowledge Base ID>'
    modelArn = 'arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2'
    prompt = f"""\n\nHuman:
Please answer [question] appropriately.
[question]
{user_prompt}
Assistant:
"""
    # Retrieve relevant chunks from the knowledge base and generate an answer
    # with the specified foundation model in a single call.
    response = bedrock_agent_runtime_client.retrieve_and_generate(
        input={
            'text': prompt,
        },
        retrieveAndGenerateConfiguration={
            'type': 'KNOWLEDGE_BASE',
            'knowledgeBaseConfiguration': {
                'knowledgeBaseId': knowledge_base_id,
                'modelArn': modelArn,
            }
        }
    )
    print("Received response: " + json.dumps(response, ensure_ascii=False))
    response_output = response['output']['text']
    return response_output
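Note that retrieve_and_generate() returns more than the generated text: per the boto3 documentation, the response also carries a citations list that points back at the retrieved knowledge-base chunks. A minimal sketch of pulling the cited S3 URIs out of the response dict (the sample response here is hand-written to mimic the documented shape, not real API output):

```python
# Hand-written sample shaped like the documented retrieve_and_generate()
# response, for illustration only.
sample_response = {
    'sessionId': 'example-session',
    'output': {'text': 'Generated answer text.'},
    'citations': [
        {
            'retrievedReferences': [
                {
                    'content': {'text': 'Relevant chunk from the knowledge base.'},
                    'location': {
                        'type': 'S3',
                        's3Location': {'uri': 's3://my-kb-bucket/doc1.pdf'},
                    },
                }
            ]
        }
    ],
}

def extract_sources(response):
    """Collect the S3 URIs cited by the generated answer."""
    uris = []
    for citation in response.get('citations', []):
        for ref in citation.get('retrievedReferences', []):
            uri = ref.get('location', {}).get('s3Location', {}).get('uri')
            if uri:
                uris.append(uri)
    return uris

print(extract_sources(sample_response))
# ['s3://my-kb-bucket/doc1.pdf']
```

Returning these URIs alongside response_output from the handler is a cheap way to let callers see which documents the answer was grounded in.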
Hi, Riku is fully right: I also use retrieve_and_generate in my current project. The modelArn parameter lets you select the LLM that you want / need.
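To illustrate the modelArn point: foundation-model ARNs follow a predictable pattern, so switching models is just a matter of swapping the model ID. A small sketch (the model ID shown is an example; check the Bedrock console for the models enabled in your account and region):

```python
def foundation_model_arn(region, model_id):
    # Foundation models are AWS-owned, so the ARN has no account ID
    # segment, hence the double colon '::'.
    return f'arn:aws:bedrock:{region}::foundation-model/{model_id}'

print(foundation_model_arn('us-east-1', 'anthropic.claude-v2'))
# arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2
```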