Bedrock Unveiled: A Quick Lambda Example

4 minute read
Content level: Intermediate

Understanding the basics of Amazon Bedrock and its model consumption via Lambda functions, as part of an Amplify-based AI assistant app

Introduction

One of the most awaited releases of 2023 is finally Generally Available (GA): Amazon Bedrock, a fully managed service to build generative AI applications with foundation models (FMs).

This article shows how to enable Bedrock FMs and use them, for example, in Lambda functions. The use case is part of an AI assistant I created in a previous article.

The main difference from the previous app is the use of the AI21 Labs Jurassic model (via Bedrock) instead of the plain OpenAI model, and the resulting architecture will look like this:

Revisited architecture for the AI speaker assistant

Set up Bedrock

Bedrock provides a set of pre-defined FMs that you can query:

Currently supported FMs

To use such FMs in Bedrock, you must first enable them:

Manage model access

  • The next page shows a list of available models:

FMs model list

  • You enable models by editing the list and selecting the preferred FMs, such as Anthropic's models or AI21 Labs' Jurassic, the one I chose for this tutorial.

FMs model selection

  • Once saved, it takes a few minutes before the model becomes available for consumption.

FM enablement in progress

  • Once the model is available in Bedrock, you can play around with it in the AWS console via the Chat, Text, and Image sections, depending on the selected model. Below is an example of asking questions to the enabled model:

Chat playground
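You can also inspect the catalog programmatically. Below is a minimal sketch (assuming local AWS credentials for us-east-1 and a boto3 version recent enough to include the Bedrock service) that lists the AI21 Labs models available in the region:

    import boto3

    # Control-plane client: model management APIs live on the 'bedrock' service
    bedrock = boto3.client(service_name='bedrock', region_name='us-east-1')

    # List all foundation models and keep the AI21 ones (modelId prefix 'ai21.')
    for model in bedrock.list_foundation_models()["modelSummaries"]:
        if model["modelId"].startswith("ai21."):
            print(model["modelId"], "-", model["modelName"])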

All good, but let's now see how to use the Bedrock APIs to invoke models from Lambda functions.

Invoke model in Lambda

Bedrock provides API integrations via the AWS CLI and SDKs, making consumption of FMs very easy. Let's see how.

  • Create a Python 3.9 Lambda function.

  • You must set the boto3 dependency explicitly, as the version bundled with the Lambda runtime does not include the Bedrock service yet. As I used Amplify (via the amplify add function CLI) to generate the function, I pinned boto3 to version 1.28.57 in the Pipfile; the same can be achieved with a requirements.txt file, as sketched below.
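For reference, a minimal Pipfile excerpt might look like this (assuming Pipenv's exact-version pin syntax; with requirements.txt the equivalent is a single boto3==1.28.57 line):

    # Pipfile (excerpt): pin boto3 so the packaged Lambda ships a Bedrock-aware version
    [packages]
    boto3 = "==1.28.57"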

  • The Lambda code, explained in inline comments, is the following:

    import json
    import boto3

    # Bedrock (control-plane) client, used to interact with model management APIs
    bedrock = boto3.client(
        service_name='bedrock',
        region_name='us-east-1'
    )

    # Bedrock Runtime client, used to invoke and question the models
    bedrock_runtime = boto3.client(
        service_name='bedrock-runtime',
        region_name='us-east-1'
    )

    def handler(event, context):

        # Example of retrieving information about available models,
        # here used to look up the modelId of Jurassic-2 Ultra
        foundation_models = bedrock.list_foundation_models()
        matching_model = next(
            (model for model in foundation_models["modelSummaries"]
             if model.get("modelName") == "Jurassic-2 Ultra"),
            None
        )

        # The end user's question, extracted from the request body
        prompt = json.loads(event.get("body")).get("input").get("question")

        # The payload to be provided to Bedrock
        body = json.dumps(
            {
                "prompt": prompt,
                "maxTokens": 200,
                "temperature": 0.7,
                "topP": 1,
            }
        )

        # The actual call to retrieve an answer from the model
        response = bedrock_runtime.invoke_model(
            body=body,
            modelId=matching_model["modelId"],
            accept='application/json',
            contentType='application/json'
        )

        response_body = json.loads(response.get('body').read())

        # The model response, mapped to the answer
        answer = response_body.get('completions')[0].get('data').get('text')

        return {
            'statusCode': 200,
            'headers': {
                'Access-Control-Allow-Headers': '*',
                'Access-Control-Allow-Origin': '*',
                'Access-Control-Allow-Methods': 'OPTIONS,POST,GET'
            },
            'body': json.dumps({"Answer": answer})
        }
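  • For a quick test from the Lambda console, a test event in the following shape matches what the handler parses (the question text is just an example):

    {
      "body": "{\"input\": {\"question\": \"What is Amazon Bedrock?\"}}"
    }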
  • The Lambda execution role must allow actions on Bedrock resources, such as bedrock:ListFoundationModels and bedrock:InvokeModel. You can create these as custom policies via the console, the CLI, or IaC. As the Lambda is part of an Amplify project, I updated the auto-generated custom-policies.json file:
    [
      {
        "Action": [
          "bedrock:ListFoundationModels"
        ],
        "Resource": [
          "*"
        ]
      },
      {
        "Action": [
          "bedrock:InvokeModel"
        ],
        "Resource": [
          "arn:aws:bedrock:us-east-1::foundation-model/ai21.j2-ultra"
        ]
      }
    ]
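  • Once deployed with these permissions, you can smoke-test the function by invoking it directly. Below is a minimal sketch; the function name askJohanna is hypothetical, so substitute the name Amplify generated for you:

    import json
    import boto3

    lambda_client = boto3.client('lambda', region_name='us-east-1')

    # API Gateway-style event matching what the handler parses
    event = {"body": json.dumps({"input": {"question": "What is Amazon Bedrock?"}})}

    response = lambda_client.invoke(
        FunctionName='askJohanna',  # hypothetical: use your Amplify-generated function name
        Payload=json.dumps(event)
    )

    result = json.loads(response['Payload'].read())
    print(json.loads(result['body'])['Answer'])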

The AI Assistant

The assistant can be found at this address, and full details of the OpenAI implementation can be found here. It is an Amplify-based web application that allows end users to ask questions by voice to our new Bedrock/Jurassic-based assistant, Johanna, which replies with its own voice. As mentioned, the main difference from the previous implementation is the usage of the above-described Lambda function integrating with Bedrock FMs; that's it.

Conclusion

As a first impression, Bedrock looks like an ecosystem of FMs, seamlessly exposed for consumption.

It presents a very intuitive approach to exposing these models, drastically lowering the barrier to entry in the domain.

Governance and privacy are crucial, and I am very curious to explore how they hold up in regulated environments such as finance.

The next step will be to combine Bedrock models with LangChain so that you can augment the FMs with your own data. Watch this space and my LinkedIn, and stay tuned for the next article.