Pricing for AWS Bedrock knowledge bases


Hi, is there specific documentation about AWS Bedrock knowledge bases pricing?

Data source -> Web Crawler.

Content chunking and parsing -> Default.

Embeddings model -> Embed English v3.

Vector database -> Quick create a new vector store - Recommended.

Thank you.

3 Answers
Accepted Answer

Hi Jay,

There isn’t a specific charge for using the Bedrock knowledge bases themselves, but you will incur costs for the AI models and vector databases associated with them. Here’s a breakdown:

1. AI Models: You'll be charged based on the AI models you use. Pricing details are available on the AWS Bedrock Pricing page: https://aws.amazon.com/bedrock/pricing/?nc1=h_ls

2. Vector Databases: If you're using OpenSearch Serverless as part of your setup, you'll need to consider its costs. Pricing details can be found on the Amazon OpenSearch Service Pricing page: https://aws.amazon.com/opensearch-service/pricing/?nc1=h_ls#Amazon_OpenSearch_Serverless

The overall cost will be the sum of the vector store and the model inferences. You can get a more precise estimate by looking at the input and output tokens during model inference, which will help you calculate costs based on usage.
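For a rough illustration, here is a minimal sketch in Python of how the two cost components could be added up. All rates and usage figures below are hypothetical placeholders, not real prices; take the actual per-token and OpenSearch Serverless OCU rates from the pricing pages linked above.

```python
# Rough cost estimate for a Bedrock knowledge base setup.
# All rates below are HYPOTHETICAL placeholders -- replace them with the
# current values from the Bedrock and OpenSearch Serverless pricing pages.

EMBED_PRICE_PER_1K_TOKENS = 0.0001   # hypothetical Embed English v3 rate (USD)
LLM_INPUT_PRICE_PER_1K = 0.003       # hypothetical LLM input-token rate (USD)
LLM_OUTPUT_PRICE_PER_1K = 0.015      # hypothetical LLM output-token rate (USD)
OCU_PRICE_PER_HOUR = 0.24            # hypothetical OpenSearch Serverless OCU rate (USD)

def monthly_cost(embed_tokens, llm_input_tokens, llm_output_tokens,
                 ocu_count, hours=730):
    """Sum the vector store cost and the model inference cost for one month."""
    embedding = embed_tokens / 1000 * EMBED_PRICE_PER_1K_TOKENS
    inference = (llm_input_tokens / 1000 * LLM_INPUT_PRICE_PER_1K
                 + llm_output_tokens / 1000 * LLM_OUTPUT_PRICE_PER_1K)
    vector_store = ocu_count * OCU_PRICE_PER_HOUR * hours
    return embedding + inference + vector_store

# Example: 2M embedded tokens, 500k input / 100k output LLM tokens, 2 OCUs.
print(f"~${monthly_cost(2_000_000, 500_000, 100_000, 2):.2f} per month")
```

In a quick-create setup, the OpenSearch Serverless OCUs usually dominate the bill because they are billed per hour regardless of traffic, so it is worth checking its minimum capacity on the pricing page.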


Hello.

There is no charge for the Bedrock knowledge bases feature itself, but there is a charge for the AI models and vector databases used.
AI model pricing is listed in the document below:
https://aws.amazon.com/bedrock/pricing/?nc1=h_ls

In your case, since you chose the recommended quick-create option, you are using OpenSearch Serverless, which is created together with the knowledge base, so you should check its price list in the document below:
https://aws.amazon.com/opensearch-service/pricing/?nc1=h_ls#Amazon_OpenSearch_Serverless


Hi,

The documentation says clearly: "When using Agents for Amazon Bedrock and Knowledge Bases for Amazon Bedrock, you are only charged for the models and the vector databases you use with these capabilities".

See https://aws.amazon.com/bedrock/pricing/?nc1=h_ls to confirm.

So, you have to select a vector store based on the list you see at https://docs.aws.amazon.com/bedrock/latest/userguide/knowledge-base-setup.html

The usual cost of vector stores has two components: the size of the vector data stored and the number of requests to the store.

Then, you also have to select an LLM among the ones supported on Bedrock (see https://docs.aws.amazon.com/bedrock/latest/userguide/knowledge-base-supported.html). Each of these models has its own cost: see https://aws.amazon.com/bedrock/pricing/?nc1=h_ls

The cost of your setup will be the sum of the vector store cost and the LLM inference cost.

To estimate the cost of inferences, you can obtain the number of input and output tokens from the response metadata returned by the LLM. Run some trials with queries representative of your future workload to compute how much they will cost in aggregate.
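As one way to run those trials, here is a minimal sketch using boto3 and the Bedrock Runtime Converse API, which reports inputTokens and outputTokens in its usage field; it assumes you call the generation model directly for the trial queries. The model ID, sample queries, and per-1,000-token prices are assumptions to replace with your own model choice and the current price list.

```python
import boto3

# Minimal sketch: run a few representative queries against a Bedrock model
# and aggregate the token counts reported in the response's "usage" field.
# The model ID and per-1,000-token prices are ASSUMPTIONS -- substitute the
# model you actually use and the current rates from the Bedrock pricing page.

MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"  # assumed example model
INPUT_PRICE_PER_1K = 0.00025    # hypothetical input-token rate (USD)
OUTPUT_PRICE_PER_1K = 0.00125   # hypothetical output-token rate (USD)

client = boto3.client("bedrock-runtime")

sample_queries = [
    "What does our refund policy say about digital products?",
    "Summarize the onboarding steps for new customers.",
]

total_in = total_out = 0
for query in sample_queries:
    response = client.converse(
        modelId=MODEL_ID,
        messages=[{"role": "user", "content": [{"text": query}]}],
    )
    usage = response["usage"]
    total_in += usage["inputTokens"]
    total_out += usage["outputTokens"]

estimated_cost = (total_in / 1000 * INPUT_PRICE_PER_1K
                  + total_out / 1000 * OUTPUT_PRICE_PER_1K)
print(f"{total_in} input / {total_out} output tokens "
      f"~= ${estimated_cost:.4f} for these trial queries")
```

Scale the per-query average up to your expected monthly query volume, then add the vector store cost on top to get the full estimate.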

Best,

Didier

