
Appsync resolver conditional update of AWSJSON attribute


I have a mutation for an object where I have multiple non-required fields that may get updated in a single call. One of the attributes, data, is defined as AWSJSON in the schema. To build the update I check supplied values with statements like this:
#if( !$util.isNull(${context.arguments.input.data}) )

If there is data for the attribute, the necessary values are added to maps used to build the final SET expression: attribute names #data => data (needed for attributes whose names are reserved words) and values :data => value.

The final update uses $utils.toJson to apply the constructed maps. The problem is, when I supply data for the AWSJSON attribute, I get this error:

"Expected JSON object for attribute value '$[update][expressionValues][:data]' but got 'STRING' instead."

All other attributes work as long as the JSON attribute is not supplied.

However, if I instead supply the expressionValues inline (not using $utils.toJson, which means I lose the ability to conditionally add attributes to the update), it works as expected. Am I applying the collected expressionValues map to the final update statement the wrong way, perhaps with a different $utils method I should use instead? Is there any other workaround for handling the collection of non-required attributes in the mutation? At worst I can make a separate mutation just for the JSON attribute, but making two calls instead of one is clearly not ideal.

I can make a call from the Appsync Console like this:

mutation UpdateMyItem {
  updateMyItem(input: {
    id: "ITEM-ID-HERE",
    data:"[{\"xyz\": 101}]"
  }) {
    id,
    data
  }
}

Resolver:

{
  "version": "2017-02-28",
  "operation" : "UpdateItem",
  "key" : {
    "id" : $util.dynamodb.toDynamoDBJson($context.arguments.input.id)
  },
  ## Set up some space to keep track of things we're updating
  #set( $expSet = {} )
  #set( $expNames = {} )
  #set( $expValues = {} )

  ## updatedAt
  #set($now = $util.time.nowISO8601())
  $!{expSet.put("updatedAt", ":updatedAt")}
  $!{expValues.put(":updatedAt", { "S" : "$now"})}

  ## data
  #if( !$util.isNull(${context.arguments.input.data}) )
    $!{expSet.put("#data", ":data")}
    $!{expNames.put("#data", "data")}
    $!{expValues.put(":data", $util.dynamodb.toDynamoDBJson($context.arguments.input.data) )}
  #end

  ## other redacted optional input arguments of various types would be here

  ## build the expression
  #set( $expression = "SET" )
  #foreach( $entry in $expSet.entrySet() )
    #set( $expression = "${expression} ${entry.key} = ${entry.value}" )
    #if ( $foreach.hasNext )
      #set( $expression = "${expression}," )
    #end
  #end

  "update" : {
    "expression": "${expression}",
    "expressionNames": $utils.toJson($expNames),
    ## this fails and results in an error: 
    ## "Expected JSON object for attribute value '$[update][expressionValues][:data]' but got 'STRING' instead."
    "expressionValues": $util.toJson( $expValues )
    ## this works and all attributes are updated
    ##"expressionValues": {
    ##  ":updatedAt" : $util.dynamodb.toDynamoDBJson($now),
    ##  ":data" : $util.dynamodb.toDynamoDBJson($context.arguments.input.data)
    ##}
  }
}
2 Answers
Accepted Answer

Hi snewton,

I got your back. You're on the right track here!

tl;dr- change

$!{expValues.put(":data", $util.dynamodb.toDynamoDBJson($context.arguments.input.data) )}

to

$!{expValues.put(":data", $util.dynamodb.toDynamoDB($context.arguments.input.data) )}

.

If you run your resolver code as-is using the Console resolver tester, you'll notice the output of your code is

...
"expressionValues": {
  ":updatedAt": { "S": "2019-07-10T20:35:30.000Z" },
  ":data": "{\"S\":\"[{\\\"xyz\\\": 101}]\"}"
}
...

The difference is that the ":updatedAt" key prints a DynamoDB typed object (see: https://docs.aws.amazon.com/appsync/latest/devguide/resolver-mapping-template-reference-dynamodb.html#aws-appsync-resolver-mapping-template-reference-dynamodb-typed-values-request), while the ":data" key prints a DynamoDB typed object serialized as a string. That is why you got the "Expected JSON object..." error: the DynamoDB resolver expects a JSON object that specifies the type of the value to write to DynamoDB.
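For comparison, once ":data" is stored as a Map rather than a pre-serialized string, the same section of the rendered template comes out as a nested typed object (illustrative output; the timestamp is a placeholder):

...
"expressionValues": {
  ":updatedAt": { "S": "2019-07-10T20:35:30.000Z" },
  ":data": { "S": "[{\"xyz\": 101}]" }
}
...

Note that ":data" is now a JSON object with an "S" type key, exactly like ":updatedAt", instead of a quoted string containing escaped JSON.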

If you look at the DynamoDB resolver util reference (see: https://docs.aws.amazon.com/appsync/latest/devguide/resolver-util-reference.html#dynamodb-helpers-in-util-dynamodb), you'll see that $util.dynamodb.toDynamoDBJson(Object) returns a String. You save that string into the expValues map, which is then passed to $util.toJson(Object); that method expects an object and serializes it, so your already-serialized string is emitted as a quoted string in the template output. I understand the confusion, because there are a few layers here: how you store the values in the expValues map, and the fact that all variables must ultimately evaluate to a string to be printed as the output of the VTL (which, overall, is a string representing a JSON object understood by the DynamoDB data source).

Looking further at the resolver reference, you'll see $util.dynamodb.toDynamoDB(Object), which returns a Map, the same shape as the literal map you defined for ":updatedAt" ({ "S": "$now" }). That gives you the expected output under expressionValues. I hope this helps clear things up!
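Putting it together, the data branch of the resolver becomes the following. Only the ":data" line changes from your original; everything else stays as-is:

  ## data
  #if( !$util.isNull(${context.arguments.input.data}) )
    $!{expSet.put("#data", ":data")}
    $!{expNames.put("#data", "data")}
    ## toDynamoDB returns a Map, so $util.toJson($expValues) later emits
    ## a proper typed object ({ "S": ... }) instead of an escaped string
    $!{expValues.put(":data", $util.dynamodb.toDynamoDB($context.arguments.input.data) )}
  #end

The final "expressionValues": $util.toJson( $expValues ) line then works for every conditionally added attribute, since each map value is an object rather than a pre-serialized string.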

answered 3 years ago

That may be a better solution.

I had gotten another method working:
$!{expValues.put(":data", $util.parseJson($util.dynamodb.toDynamoDBJson($context.arguments.input.data)) )}

which adds the parseJson layer to get the same result but is almost certainly less efficient.

The solution by @AaronHarris is clearer and also works. Sorry for not updating the thread with my previous answer, but this is solved now.

answered 3 years ago
