Interactive message sent in response to a fulfilled intent breaks an AWS Connect flow if the Lex V2 bot was created after August 17th


On August 17, 2022, Amazon Lex V2 released a change to the way conversations with the user are managed. It looks like the change breaks an AWS Connect flow if the underlying Lambda function sends an interactive message in response to a fulfilled intent.

Here's the AWS Lambda function I use for testing:

// Toggle these two constants to reproduce the combinations described below.
const isNextStepElicitIntent = true;
const useCustomPayload = true;

exports.handler = async (event) => {
    console.log("Request: " + JSON.stringify(event));

    // Amazon Connect ListPicker interactive message template.
    const customPayload = {
        "templateType": "ListPicker",
        "version": "1.0",
        "data": {
            "content": {
                "title": "Interactive Test",
                "subtitle": "Tap to select option",
                "elements": [
                    { "title": "Option 1" },
                    { "title": "Option 2" }
                ]
            }
        }
    };

    // Either an interactive message or a plain-text fallback.
    let message;
    if (useCustomPayload) {
        message = {
            "contentType": "CustomPayload",
            "content": JSON.stringify(customPayload)
        };
    } else {
        message = {
            "contentType": "PlainText",
            "content": "Help response"
        };
    }

    // Either elicit the next intent, or elicit a slot of the current intent.
    let response;
    if (isNextStepElicitIntent) {
        response = {
            "sessionState": {
                "sessionAttributes": {},
                "dialogAction": {
                    "type": "ElicitIntent"
                }
            },
            "messages": [message]
        };
    } else {
        response = {
            "sessionState": {
                "sessionAttributes": {},
                "dialogAction": {
                    "type": "ElicitSlot",
                    "slotToElicit": "DummySlot"
                },
                "intent": {
                    "name": "Help",
                    "slots": {
                        "DummySlot": null
                    }
                }
            },
            "messages": [message]
        };
    }

    console.log("Response: " + JSON.stringify(response));
    return response;
};
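To rule out a malformed template as the cause, the payload above can be checked against the basic structural requirements of a ListPicker interactive message (templateType, version, a titled content block, and a non-empty element list). This is a minimal sketch; `validateListPicker` is a hypothetical local helper, not part of any AWS SDK, and the field checks only cover the fields used in this question:

```javascript
// Structural sanity check for a ListPicker interactive message payload.
// validateListPicker is a local helper written for this question, not an AWS API.
function validateListPicker(payload) {
    const errors = [];
    if (payload.templateType !== "ListPicker") errors.push("templateType must be 'ListPicker'");
    if (payload.version !== "1.0") errors.push("version must be '1.0'");
    const content = payload.data && payload.data.content;
    if (!content || typeof content.title !== "string") errors.push("data.content.title is required");
    if (!Array.isArray(content && content.elements) || content.elements.length === 0) {
        errors.push("data.content.elements must be a non-empty array");
    } else if (!content.elements.every(e => typeof e.title === "string")) {
        errors.push("every element needs a title");
    }
    return errors;
}

// The same payload the Lambda function sends.
const customPayload = {
    templateType: "ListPicker",
    version: "1.0",
    data: {
        content: {
            title: "Interactive Test",
            subtitle: "Tap to select option",
            elements: [{ title: "Option 1" }, { title: "Option 2" }]
        }
    }
};

console.log(validateListPicker(customPayload)); // → [] (no structural problems found)
```

The payload passes this check, which points at the Lex-Connect hand-off rather than the template itself.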

To reproduce:

1. Take a Lex V2 bot created before August 17th, add a "Help" intent with "help" as a sample utterance, add a "DummySlot" slot to the intent, and check "Use a Lambda function for initialization and validation" in the "Code hooks" section.
2. Attach the test Lambda function to the preferred alias/language combination of the Lex bot.
3. Add the bot to an AWS Connect flow using a "Get customer input" block, and try the flow with the "help" utterance. You get the expected interactive message in the response.
4. Do the same with a Lex V2 bot created after August 17th. AWS Connect goes to the "Error" output branch.

In both cases the Lex bot works fine when tested alone, so the problem is in the Lex-Connect interaction.

You can try changing the isNextStepElicitIntent and useCustomPayload constants at the top of the Lambda function. Everything works fine if the response is plain text (const useCustomPayload=false;), or if dialogAction.type is "ElicitSlot" (const isNextStepElicitIntent=false;). The problem/bug appears only if dialogAction.type is "ElicitIntent", the content of the response is an interactive message, and the Lex V2 bot was created after August 17th.
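The four flag combinations can be summarized in a small table-driven sketch. `buildResponse` condenses the handler's branching, and the `breaksConnect` flag records only the behavior observed in this question for bots created after August 17th (an observation from this report, not documented AWS behavior):

```javascript
// Condensed view of the handler's branching: which dialogAction type and
// message contentType each flag combination produces.
function buildResponse(isNextStepElicitIntent, useCustomPayload) {
    return {
        dialogActionType: isNextStepElicitIntent ? "ElicitIntent" : "ElicitSlot",
        contentType: useCustomPayload ? "CustomPayload" : "PlainText"
    };
}

// Per this report, only ElicitIntent + CustomPayload sends Connect
// to the "Error" output branch (bots created after August 17th).
const combos = [];
for (const elicitIntent of [true, false]) {
    for (const customPayload of [true, false]) {
        const r = buildResponse(elicitIntent, customPayload);
        const breaksConnect =
            r.dialogActionType === "ElicitIntent" && r.contentType === "CustomPayload";
        combos.push({ ...r, breaksConnect });
    }
}
console.table(combos);
```

Of the four combinations, exactly one triggers the failure, which is what makes this look like a regression in how Connect handles the ElicitIntent + CustomPayload case rather than a general interactive-message problem.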

2 Answers
Accepted Answer

Hi, thanks for highlighting the issue with the custom payload. We have identified the issue and a fix is being worked on with the highest priority. The fix will be available by 9/23.

Saket
answered 3 months ago
  • I confirm the issue is fixed. Thx Saket.


The v2 bots I have from before that date are in production, and it sounds like creating a new one will not help. I would open a case with AWS and have them confirm and fix the issue.

dmacias
answered 3 months ago
