AWS Bedrock - Knowledge Base

Hi, I am fairly new to AWS and need some assistance with Bedrock services. I would like help with the following:

  • When using a knowledge base, can we set the system prompt to allow the LLM to generate from its pretrained data without hallucinating?
  • When using LangChain with Bedrock, can we use conversation summary, where it summarizes the user's query after each request?
2 Answers
Accepted Answer

Thank you for your questions about AWS Bedrock and its Knowledge Base feature. I'll address each of your points:

  1. Regarding setting the system prompt to allow the LLM to generate from its pretrained data without hallucinating:

When using Amazon Bedrock Knowledge Bases, the primary goal is to ground the LLM's responses in the information contained within your knowledge base, rather than relying solely on the model's pretrained data. The knowledge base is designed to provide context and factual information to the LLM, which helps reduce hallucinations.

However, it's important to note that you don't have direct control over a "system prompt" in the same way you might with other LLM implementations. Instead, Bedrock manages the interaction between the knowledge base and the LLM internally. The focus is on using the Retrieval Augmented Generation (RAG) technique to enhance the model's responses with relevant information from your data sources.
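As a concrete sketch of that RAG flow, the request below is shaped for Bedrock's `RetrieveAndGenerate` API (`bedrock-agent-runtime` client). The knowledge base ID, model ARN, and query are placeholders, and the live call is left in comments because it needs AWS credentials:

```python
# Sketch: assemble parameters for Bedrock's RetrieveAndGenerate API.
# All IDs and ARNs below are placeholders, not real resources.

def build_rag_request(kb_id, model_arn, query):
    """Build the request body for a RetrieveAndGenerate call."""
    return {
        "input": {"text": query},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
            },
        },
    }

params = build_rag_request(
    kb_id="KB1234567890",  # placeholder knowledge base ID
    model_arn="arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",  # placeholder
    query="What does our refund policy say?",
)

# With boto3 installed and credentials configured, the call itself would be:
#   client = boto3.client("bedrock-agent-runtime")
#   response = client.retrieve_and_generate(**params)
#   print(response["output"]["text"])
```

Bedrock retrieves relevant chunks from the knowledge base and injects them into the prompt for you, which is why there is no separately exposed system prompt in this flow.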

  2. Regarding the use of conversation summary with LangChain and Bedrock:

While Bedrock does offer some conversation context management features, the specific functionality you're asking about (summarizing the user's query after each request) is not a built-in feature of Bedrock itself. However, you may be able to implement this kind of functionality using LangChain in combination with Bedrock.

LangChain is a flexible framework that allows you to build custom chains and agents. You could potentially create a chain that includes a summarization step after each user query. This would be separate from Bedrock's internal context management but could be integrated into your overall application flow.
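As a framework-free sketch of that "summarize after each request" pattern, the loop below keeps a running summary that is refreshed after every turn. The `summarize` function here is a trivial stand-in; in a real application it would be an LLM call through Bedrock (LangChain's `ConversationSummaryMemory` packages essentially this loop for you):

```python
# Hypothetical sketch of summarizing the conversation after each request.
# The summarizer is a stub; in practice it would prompt a Bedrock model.

def summarize(previous_summary, query, answer):
    """Stand-in summarizer: in production, ask an LLM to condense the turn."""
    turn = f"User asked: {query} | Assistant said: {answer}"
    return (previous_summary + " " + turn).strip()

class SummarizingConversation:
    def __init__(self, answer_fn):
        self.answer_fn = answer_fn  # e.g. a wrapper around RetrieveAndGenerate
        self.summary = ""

    def ask(self, query):
        # Pass the running summary as context with each request.
        answer = self.answer_fn(query, self.summary)
        # Refresh the summary after every turn.
        self.summary = summarize(self.summary, query, answer)
        return answer

# Demo with a canned answer function standing in for Bedrock.
conv = SummarizingConversation(lambda q, s: f"(answer to: {q})")
conv.ask("What is a knowledge base?")
conv.ask("How do I create one?")
print(conv.summary)
```

The key design point is that summarization is a step you own in the chain, outside Bedrock, so you control when it runs and what prompt it uses.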

It's worth noting that Bedrock does provide some conversation context management. When using the RetrieveAndGenerate API, Bedrock manages short-term memory of the conversation to provide more contextual results. However, this is handled internally by the service and is not the same as explicitly summarizing each query.
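Concretely, that short-term memory is carried by a session ID: the first `RetrieveAndGenerate` response returns a `sessionId`, and passing it back on subsequent requests lets Bedrock reuse the conversation context. A small sketch (the session ID is a placeholder, and the live call is commented out because it needs AWS credentials):

```python
# Sketch: carry Bedrock's short-term conversation memory across turns
# by echoing back the sessionId from the previous response.

def with_session(params, session_id=None):
    """Attach a prior sessionId so Bedrock reuses its conversation context."""
    out = dict(params)
    if session_id is not None:
        out["sessionId"] = session_id
    return out

base = {"input": {"text": "And what about digital items?"}}
params = with_session(base, "sess-abc123")  # placeholder session ID

# In a live application:
#   response = client.retrieve_and_generate(**params)
#   session_id = response["sessionId"]  # carry forward to the next turn
```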

For more advanced conversation management or custom summarization, you would likely need to implement additional logic in your application, possibly using LangChain's capabilities in conjunction with Bedrock's APIs.

Sources
Knowledge bases for Amazon Bedrock - AWS Prescriptive Guidance
How Amazon Bedrock knowledge bases work - Amazon Bedrock
Amazon Knowledge Bases for Bedrock Conversation Context | AWS re:Post

answered 3 days ago
EXPERT reviewed 3 days ago

So am I not able to edit the system prompt and send it using the API? I ask because I see that I can when I am in the Test Knowledge Base stage.

answered 3 days ago
