1 Answer
Hi,
If it is an option for you, the right way to achieve this is to use a Knowledge Base for Amazon Bedrock: the service will then restrict the LLM's answer to the strict content of the RAG knowledge base via proper prompt engineering.
See all details of this mechanism in my article: https://repost.aws/articles/AR-LV1HoR_S0m-qy89wXwHmw/the-leverage-of-llm-system-prompt-by-knowledge-bases-for-bedrock-in-rag-workflows
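As a minimal sketch of how this is invoked programmatically: the Bedrock Agent Runtime `RetrieveAndGenerate` API takes the user question plus a knowledge-base configuration, and the service handles retrieval and grounded generation for you. The `KB_ID` and `MODEL_ARN` values below are placeholders you would substitute with your own; the helper function name is mine, not part of the AWS SDK.

```python
# Placeholders - replace with your own Knowledge Base ID and model ARN.
KB_ID = "YOUR_KB_ID"
MODEL_ARN = "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0"

def build_rag_request(question: str) -> dict:
    """Assemble the RetrieveAndGenerate request that scopes the LLM
    to the content of the Bedrock Knowledge Base."""
    return {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": KB_ID,
                "modelArn": MODEL_ARN,
            },
        },
    }
```

You would then pass this payload to the `bedrock-agent-runtime` client, e.g. `boto3.client("bedrock-agent-runtime").retrieve_and_generate(**build_rag_request("What is our refund policy?"))`, and read the grounded answer from `response["output"]["text"]`.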
Best,
Didier