Hi all,
I am building a generative AI chatbot using Amazon Bedrock, Amazon OpenSearch Serverless, AWS Lambda, AWS CloudFormation, and Amazon S3. Here is the GitHub repository for the code: https://github.com/aws-samples/amazon-bedrock-samples/tree/main/rag-solutions/contextual-chatbot-using-knowledgebase
I have tested the chatbot using my own data file. However, it only answers questions that are phrased in a certain way (i.e., questions that use the same wording as the data file). For example, my document discusses the feedback my company has received from potential customers. When I ask the chatbot "what feedback have potential customers given my company?", it responds correctly. But when I ask "what have potential customers said about my company?", it responds with "Sorry, I am unable to assist you with this request".
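For context, this is roughly how the Lambda invokes the knowledge base; this is a minimal sketch based on the sample repo's approach, and the knowledge base ID and model ARN below are placeholders, not values from the repo:

```python
def build_request(question: str) -> dict:
    """Build the kwargs for bedrock-agent-runtime's retrieve_and_generate call.

    The knowledge base ID and model ARN are placeholder assumptions.
    """
    return {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": "XXXXXXXXXX",  # placeholder
                "modelArn": (
                    "arn:aws:bedrock:us-east-1::foundation-model/"
                    "anthropic.claude-3-5-sonnet-20240620-v1:0"
                ),
            },
        },
    }


def ask(question: str) -> str:
    """Send the question to the Bedrock knowledge base and return the answer text."""
    import boto3  # imported here so build_request can be inspected without AWS credentials

    client = boto3.client("bedrock-agent-runtime")
    response = client.retrieve_and_generate(**build_request(question))
    return response["output"]["text"]
```

Both phrasings of the question go through the same `retrieve_and_generate` call, so I assume the difference in behavior comes from the retrieval step rather than the model itself.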
How do I get the chatbot to recognize that differently phrased questions are asking for the same information from my data file?
Also, I asked the chatbot general-knowledge questions (questions whose answers are not in my data file, such as "why is the sky blue?"), and it again returned "Sorry, I am unable to assist you with this request". This confuses me: I am using the Claude 3.5 Sonnet LLM, so I expected the chatbot to be able to answer generic queries unrelated to the data I feed it via my S3 bucket. Given that Claude 3.5 Sonnet is trained on public data from across the web, how do I enable the bot's general-knowledge capability so it can answer simple questions like that?
Thank you