Enable Amazon Bedrock in 3rd Party GenAI Tools and Plug-ins

4 minute read
Content level: Intermediate

Enable Amazon Bedrock in tools that don’t natively support it.

Today, generative artificial intelligence (GenAI) is everywhere, and numerous tools provide plug-ins that integrate with widely used consumer-facing GenAI chat applications. The downside is that each of these services has a limited free tier or charges a monthly fee (some as high as $200/month), and offers a limited selection of models that may not fit your use cases well. Amazon Bedrock, on the other hand, offers an incredible array of the best open-source and proprietary models, all with low token-based pricing. This lets you choose the best model for each task and pay only for the tokens you use. Unfortunately, most of the tools with plug-ins focus only on the consumer model APIs and don’t offer the ability to connect to Amazon Bedrock.

That has all changed now that AWS has published an OpenAI-compatible RESTful API for Amazon Bedrock called Bedrock Access Gateway. This allows you to take many tools that support connecting to OpenAI’s ChatGPT and point them at Amazon Bedrock instead, with access to all of its available models.

On the GitHub page for Bedrock Access Gateway there are instructions for deploying a simple but scalable architecture that hosts the gateway behind an Application Load Balancer (ALB) backed by AWS Lambda or AWS Fargate. These are great options for a team or a small business. But what if you just want to give a single user the power of Bedrock from their laptop? From the same codebase you can build a Docker container that hosts the gateway and allows you to connect your tools to all of the models Bedrock offers. Let’s get started.

Architecture Diagram

Prerequisites:

  • Workstation with Docker Desktop installed.
  • AWS account with Bedrock model access enabled.
  • IAM user with access keys (access key ID and secret) and bedrock:* permissions.
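
If you have the AWS CLI installed (not strictly required by the gateway itself), you can sanity-check the last two prerequisites before building. A quick sketch, assuming your IAM user’s keys are configured as the default profile:

```shell
# Region where you enabled Bedrock model access; us-east-1 is used
# throughout this article.
REGION="us-east-1"

# Confirm the credentials work, then list the Bedrock models visible to you.
# (These calls need valid AWS credentials, so they may fail on a fresh machine.)
aws sts get-caller-identity || echo "Credentials not configured"
aws bedrock list-foundation-models --region "$REGION" \
  --query 'modelSummaries[].modelId' || echo "Bedrock check failed"
```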

Clone the GitHub repo to your local workstation.

git clone https://github.com/aws-samples/bedrock-access-gateway.git

Now let’s build a container that runs the Bedrock Access Gateway. (If you would rather start from a pre-built container, you can docker pull ibehren1/bedrock-gateway:latest, which supports both x86_64 and ARM64.)

To build the container image run the following commands:

cd bedrock-access-gateway/src

# We want to build from the Dockerfile_ecs.
# Build the container and tag as bedrock-gateway
docker build . -f Dockerfile_ecs -t bedrock-gateway

Now let’s run the container and pass some values (AWS API key/secret and Region) for connecting to Bedrock. (Replace the placeholders with your actual key and secret.)

docker run \
	-e AWS_ACCESS_KEY_ID=<access key id> \
	-e AWS_SECRET_ACCESS_KEY=<access key secret> \
	-e AWS_REGION=us-east-1 \
	-p 8000:80 \
	-d bedrock-gateway

If you would rather use Docker Compose to run the container, here is a compose file that accomplishes the same thing.

services:
  bedrock-gateway:
    image: bedrock-gateway
    container_name: bedrock-gateway
    environment:
      - AWS_ACCESS_KEY_ID=<access key id>
      - AWS_SECRET_ACCESS_KEY=<access key secret>
      - AWS_REGION=us-east-1
    ports:
      - 8000:80/tcp
    restart: unless-stopped

At this point you have a container running and listening on port 8000. The container exposes a RESTful API that matches the OpenAI spec; when it receives requests, it translates them into Amazon Bedrock calls and proxies them to Bedrock for inference. You can verify that the container is running by pointing your browser to http://localhost:8000/docs, where you should be presented with the Swagger page describing the Bedrock Access Gateway’s spec.
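
You can also verify the gateway from the command line. A minimal sketch using curl (assumes the container from the previous step is listening on port 8000; bedrock is the gateway’s default API key):

```shell
# Query the gateway's OpenAI-compatible model list endpoint.
BASE_URL="http://localhost:8000/api/v1"
API_KEY="bedrock"

# Returns a JSON list of the Bedrock models the gateway can serve.
curl -s -H "Authorization: Bearer $API_KEY" "$BASE_URL/models" || \
  echo "Gateway not reachable -- is the container running?"
```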

To configure tools to use your local Bedrock Access Gateway, you will need to use the following parameters:

Base URL: http://localhost:8000/api/v1
API Key:  bedrock
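
Before wiring up a specific tool, you can exercise the gateway end to end with an OpenAI-style chat completion request. A sketch (the model ID below is only an example; substitute any model you have enabled in Bedrock, and note the curl call requires the container to be running):

```shell
# Build an OpenAI-style chat completion request body. The model ID is an
# example; use any Bedrock model ID you have enabled.
BODY='{"model": "anthropic.claude-3-haiku-20240307-v1:0", "messages": [{"role": "user", "content": "Hello"}]}'

# Sanity-check that the body is valid JSON before sending it.
echo "$BODY" | python3 -m json.tool

# Send it to the local gateway using the Base URL and API Key shown above.
curl -s -X POST "http://localhost:8000/api/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer bedrock" \
  -d "$BODY" || echo "Gateway not reachable"
```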

As an example, let’s configure an open-source plug-in called TextGen to enable use of Bedrock in Obsidian. Within the configuration dialog, select OpenAI Chat, fill in the API Key with the value bedrock, and point the Base Path to http://localhost:8000/api/v1.


Next, click the refresh button next to the model field, then open the dropdown, where you should find all of the models you have enabled in Bedrock.


Each tool you configure will be a little different, but the two values you need to set are the Base URL and the API Key.

Bedrock Pricing: https://aws.amazon.com/bedrock/pricing/

AWS EXPERT · published 3 months ago