Questions tagged with Amazon DynamoDB
Hi,
I'm encountering an issue with the AWS Amplify DataStore save function when updating an object that includes an array property. When the update input contains a new array value, the array property is not replaced; instead, the new value is merged with the existing values.
Example:
Suppose I have an object with an array property called "timeToArrive" that has the value [2, 4]. I use the DataStore save function to update the object, with an update input that includes the new value [1, 1] for the "timeToArrive" property:
```
const original = await DataStore.query(MyModel, id);
await DataStore.save(MyModel.copyOf(original, updated => {
  updated.timeToArrive = [1, 1];
}));
```
Instead of setting the "timeToArrive" property to [1, 1], the DataStore save function merges the new value with the existing values, resulting in the value [2, 4, 1, 1] for the "timeToArrive" property.
Attempts to fix:
I've tried using different approaches to update the array property, including copying the new values using a for loop or the spread operator, and using a custom resolver to update the object. However, none of these approaches have resolved the issue.
Expected behavior:
I expect the DataStore save function to update the array property correctly, by setting the property to the new value without merging it with any existing values.
Current setup:
AWS Amplify version: 5.0.18
AWS services used: DataStore, DynamoDB
Schema definition for model:
```
type MyModel @model {
  id: ID!
  timeToArrive: [Int]
}
```
Any help or suggestions on how to resolve this issue would be greatly appreciated.
Hi, I'm having an issue with incorrect results returned from a DynamoDB scan, both from the console and programmatically.
The first image shows the correct result for a query (5 rows returned):
https://www.dropbox.com/s/7x6s9q89q7gd5go/dunamodb1.png?dl=0
The second image shows the same data returned from a scan, either in the console or through Node.js code (4 rows returned):
https://www.dropbox.com/s/kpk3wus2vlf1ywz/dynamodb2.png?dl=0
There are actually 840 rows in this table if I page through 50 rows at a time in the table details view, but a scan only returns 813 rows.
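In case it helps narrow things down, here is a minimal boto3 sketch (table name is a placeholder) of a fully paginated scan; a single Scan call returns at most 1 MB of data, so any count taken from a single page can come up short:
```
import boto3

# Placeholder table name -- substitute the real one.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("MyTable")

items = []
response = table.scan()
items.extend(response["Items"])

# A single Scan call returns at most 1 MB of data; keep paging until
# LastEvaluatedKey is no longer present in the response.
while "LastEvaluatedKey" in response:
    response = table.scan(ExclusiveStartKey=response["LastEvaluatedKey"])
    items.extend(response["Items"])

print(f"Total items scanned: {len(items)}")
```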
Thx
Hi, we are unable to use DynamoDB's update API because our records are encrypted and signed. To maintain a valid encryption signature, our service has to first GET the record from DynamoDB, update it, then PUT it back. We are using client side encryption.
From [AWS docs:](https://docs.aws.amazon.com/dynamodb-encryption-client/latest/devguide/java-examples.html)
> Because your DynamoDB Mapper is configured to use the PUT save behavior, the item replaces any item with the same primary keys, instead of updating it. This ensures that the signatures match and you can decrypt the item when you get it from the table.
This opens up our application to race conditions, i.e., the record could be updated by another process sometime between the GET and the PUT. We have looked into other solutions for this, such as using a [conditional expression or version attribute](https://docs.amazonaws.cn/en_us/amazondynamodb/latest/developerguide/DynamoDBMapper.OptimisticLocking.html) that would throw a [ConditionalCheckFailedException](https://sdk.amazonaws.com/java/api/2.0.0/software/amazon/awssdk/services/dynamodb/model/ConditionalCheckFailedException.html) if the record has been modified by another process, and retrying. There are disadvantages to this -- for example, if many processes access the same record in a short period of time, there could be a lot of retries and an overall increase in latency.
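For reference, a minimal boto3 sketch (table, key, and attribute names are hypothetical) of the get-then-put pattern guarded by a version attribute, so the PutItem is rejected with a ConditionalCheckFailedException if another process wrote the record in between:
```
import boto3
from boto3.dynamodb.conditions import Attr
from botocore.exceptions import ClientError

# Hypothetical table and attribute names for illustration.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("EncryptedRecords")


def update_with_optimistic_lock(record_id, apply_changes):
    # GET the current (encrypted) item, including its version number.
    current = table.get_item(Key={"id": record_id})["Item"]
    expected_version = current["version"]

    # Apply the update client-side (decrypt, modify, re-encrypt, re-sign).
    updated = apply_changes(current)
    updated["version"] = expected_version + 1

    try:
        # PUT the whole item back, but only if no other writer bumped the version.
        table.put_item(
            Item=updated,
            ConditionExpression=Attr("version").eq(expected_version),
        )
    except ClientError as e:
        if e.response["Error"]["Code"] == "ConditionalCheckFailedException":
            # Another writer won the race; the caller can re-read and retry.
            raise
        raise
```
This is essentially the optimistic-locking approach mentioned above, so the retry/latency trade-off under heavy contention still applies.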
Is there some way to use DynamoDB's update API on a record with encrypted and signed attributes?
Hi,
We are trying to set up an **SCP** that will deny some **DynamoDB** actions based on the **IP ranges** of our network, so that IAM users, for example, can't Scan or Query a DynamoDB table from outside our network.
In this SCP we need to add an **exception** for some AWS services (like **EC2** or **Lambda**) so that they can freely Query/Scan a DynamoDB table if they have the necessary permissions.
We tried the following SCP and it worked fine for the first case (IAM users), but failed for the Lambda case, as we are still receiving an AccessDenied error when trying to Query a DynamoDB table from a Lambda function:
```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Deny",
      "Action": "dynamodb:*",
      "Resource": "*",
      "Condition": {
        "Bool": {
          "aws:ViaAWSService": "false"
        },
        "NotIpAddress": {
          "aws:SourceIp": [
            "IP Range"
          ]
        }
      }
    }
  ]
}
```
Do you know how we can add this exception for all AWS services that need to perform DynamoDB actions, without having to reference the ARN of each specific IAM role used by these services?
Error while fetching data from DynamoDB via Athena
Error:
GENERIC_USER_ERROR: Encountered an exception[java.lang.ClassCastException] from your LambdaFunction[arn:aws:lambda:ap-south-1:458993339053:function:dynamodbdata] executed in context[S3SpillLocation{bucket='swaasa-athena-db-spill-ap-south-1', key='athena-spill/d38cdaf7-7fb2-48b9-acc8-9139de2fe525/821dc5fc-f893-40f9-b25e-96b832e55c4a', directory=true}] with message[class java.lang.String cannot be cast to class java.math.BigDecimal (java.lang.String and java.math.BigDecimal are in module java.base of loader 'bootstrap')]
I tried using try_cast but am still seeing the issue.
query id: 0a47ace5-b8fc-46cb-8ff5-23194925882e
Hi, I wanted to check my understanding of what happens in the following scenario, and to find out if there's a way around it.
I have a bunch of global DynamoDB tables set up to store metadata about Documents and their Versions. When a new Document gets created I use TransactWriteItems to write both to the documents table and an ownership table (using PutItem). Each of the actions has a Condition Expression which fails if the item already exists in the table. Let's say the tables are available in regions A and B. Is the following series of events possible?
1. A user in region A creates a new document, which invokes the TransactWriteItems operation
2. Simultaneously, a user in region B creates a new document with the same ID
3. The PutItem to the documents table in region A succeeds, but for some reason there's a delay calling the PutItem to the ownership table
4. The PutItem to the documents table in region B succeeds
5. The PutItem to the ownership table in region B succeeds
6. DynamoDB replicates the information across from region B to region A
7. The PutItem to the ownership table in region A gets called, but because the item from region B has been replicated over, the Condition Check fails
What happens in this scenario? When step 6 happens, do we end up with a last-writer-wins resolution where the item in the documents table in region A gets overwritten? Does the DynamoDB instance in region A attempt to roll back the transaction (which presumably has been overwritten already)? What if the PutItem to the ownership table in region A happens after the PutItem to the ownership table in region B, but before the information gets replicated across regions? Is there a way to ensure we don't have a situation where the user sees the transaction succeed, but ends up with different data in the table from what they submitted?
(I've looked at https://repost.aws/questions/QUP7oi53yZTnaQqOIFJAYboA/concurrent-dynamo-db-transact-write-items-requests-with-same-condition-check but their question is different and doesn't involve global tables)
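For reference, a minimal boto3 sketch (table, key, and attribute names are hypothetical) of the kind of transactional write described above -- two Put actions, each conditioned on the item not already existing in its table:
```
import boto3

# Hypothetical table, key, and attribute names for illustration.
client = boto3.client("dynamodb")


def create_document(document_id, owner_id):
    client.transact_write_items(
        TransactItems=[
            {
                "Put": {
                    "TableName": "Documents",
                    "Item": {"documentId": {"S": document_id}},
                    # Fail if a document with this ID already exists.
                    "ConditionExpression": "attribute_not_exists(documentId)",
                }
            },
            {
                "Put": {
                    "TableName": "Ownership",
                    "Item": {
                        "documentId": {"S": document_id},
                        "ownerId": {"S": owner_id},
                    },
                    "ConditionExpression": "attribute_not_exists(documentId)",
                }
            },
        ]
    )
```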
Hi,
I am working with 10 IoT devices that send data to AWS IoT Core, which is then stored in DynamoDB. I want to design a web interface to display this data.
Now, in DynamoDB, I have the same partition key value and sort key value. My question is: will the data I receive from DynamoDB be in sorted order?
Please also suggest a way to handle data from multiple devices.
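For reference, a DynamoDB Query returns the items of one partition key ordered by the sort key, and ScanIndexForward controls ascending vs. descending order. A minimal boto3 sketch (table and key names are hypothetical):
```
import boto3
from boto3.dynamodb.conditions import Key

# Hypothetical table and key names for illustration.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("DeviceReadings")

# Query returns the items for one partition key (e.g. a device ID)
# ordered by the sort key (e.g. a timestamp); ScanIndexForward=False
# returns the newest items first.
response = table.query(
    KeyConditionExpression=Key("deviceId").eq("device-001"),
    ScanIndexForward=False,
    Limit=100,
)
for item in response["Items"]:
    print(item)
```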
I used to use the REST API, but since v2 I find myself using it more.
Is there a proper way to return data "neatly" other than manipulating the database response before returning?
I used to use the model feature with REST (v1). I was wondering what the recommended way is to do the same here. Thanks.
Here's an example of what I'm trying to do.
I'm selecting specific columns while avoiding the error:
> An error occurred (ValidationException) when calling the UpdateItem operation: Invalid UpdateExpression: Attribute name is a reserved keyword "owner"
and since integers/floats return as a "Decimal":
> Object of type Decimal is not JSON serializable
I added the class to convert them back to integers/floats.
I'd also be happy to get any pointers unrelated to my question. Thanks.
```
import json
import boto3
from decimal import Decimal


# DynamoDB returns numeric attributes as Decimal; convert them back to
# int/float so the response body is JSON serializable.
class DecimalEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, Decimal):
            return int(obj) if obj % 1 == 0 else float(obj)
        return json.JSONEncoder.default(self, obj)


def lambda_handler(event, context):
    try:
        dynamodb = boto3.resource('dynamodb')
        table = dynamodb.Table('SomeTable')
        response_body = ''
        status_code = 0

        # ExpressionAttributeNames lets reserved words (e.g. "owner") be
        # projected without a ValidationException.
        response = table.scan(
            ProjectionExpression="#col1, #col2, #col3, #col4, #col5",
            ExpressionAttributeNames={
                "#col1": "col1",
                "#col2": "col2",
                "#col3": "col3",
                "#col4": "col4",
                "#col5": "col5"
            }
        )
        items = response["Items"]
        mapped_items = list(map(lambda item: {
            'col1': item['col1'],
            'col2': item['col2'],
            'col3': item['col3'],
            'col4': item['col4'],
            'col5': item['col5'],
        }, items))
        response_body = json.dumps(mapped_items, cls=DecimalEncoder)
        status_code = 200
    except Exception as e:
        response_body = json.dumps(
            {'error': 'Unable to get metadata from SomeTable: ' + str(e)})
        status_code = 403

    json_response = {
        "statusCode": status_code,
        "headers": {
            "Content-Type": "application/json"
        },
        "body": response_body
    }
    return json_response
```
This just seems like too much code for a simple "GET" request of a few columns from a table.
I am using two databases, DynamoDB and Timestream, and I am trying to query both of them through an AppSync GraphQL API.
For that, I am adding multiple DynamoDB tables as separate data sources, and for Timestream I am creating a VPC endpoint and adding an HTTP data source for it.
I can create the schema, queries, and resolvers for the DynamoDB tables. However, the AWS AppSync documentation says that, for now, only public endpoints work with AppSync (ref: https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-http-resolvers.html).
So is there any other way I can connect the Timestream HTTP endpoint with AppSync?
I'm getting the following error from CloudFormation:
My snippet of template:
```
ConnectionsTable:
  Type: AWS::DynamoDB::Table
  DeletionPolicy: Delete
  UpdateReplacePolicy: Delete
  Properties:
    AttributeDefinitions:
      - AttributeName: UserID
        AttributeType: 'S'
      - AttributeName: WebsocketID
        AttributeType: 'S'
    KeySchema:
      - AttributeName: UserID
        KeyType: HASH
    SSESpecification:
      KMSMasterKeyId: !Ref ConnectionsTableKey
      SSEEnabled: true
      SSEType: KMS
    GlobalSecondaryIndexes:
      - IndexName: "WebsocketID"
        KeySchema:
          - AttributeName: WebsocketID
            KeyType: HASH
        Projection:
          NonKeyAttributes:
            - AgentID
          ProjectionType: "INCLUDE"
        ProvisionedThroughput:
          ReadCapacityUnits: "0"
          WriteCapacityUnits: "0"
    BillingMode: PAY_PER_REQUEST
    PointInTimeRecoverySpecification:
      PointInTimeRecoveryEnabled: false
    TimeToLiveSpecification:
      AttributeName: ExpiryTimestamp
      Enabled: true
```
I've tried without the `ProvisionedThroughput` block in the GSI, I've tried with it. I've tried without the GSI (that works), but then adding it in fails again.
I can't replicate it either - I've created a new template with just this table, and it creates quite happily.
Where am I going wrong?
I'm new to AWS IoT Greengrass and Raspberry Pi. I would like to do object detection on 2 or more Raspberry Pis and show the results on a webpage using AWS IoT Greengrass. Is this possible? If yes, is there a workshop or example that can guide me through it? Thank you very much!
I have a DynamoDB table and I noticed I was getting inconsistent results (in my web app) as I added more data to the table.
I added 200 test rows to the table. The data includes a date.
I entered 4 rows for a specific date. I can use the DynamoDB console to filter these items and see the 4 records that match the date as expected.
When I issue a GraphQL query in the AppSync console filtered on the same date, I only get 2 items returned.
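In case it is a pagination effect: DynamoDB evaluates filter expressions after each page of items is read, so a single call (or a list query with a limit) can return fewer matches than actually exist. A minimal boto3 sketch (table name, attribute name, and date value are hypothetical) that applies a date filter while following pagination:
```
import boto3
from boto3.dynamodb.conditions import Attr

# Hypothetical table name, attribute name, and date value for illustration.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("MyAppTable")

matches = []
scan_kwargs = {"FilterExpression": Attr("date").eq("2023-01-15")}

# The filter is applied after each page of items is read, so every page
# must be followed to see all matching items.
while True:
    response = table.scan(**scan_kwargs)
    matches.extend(response["Items"])
    if "LastEvaluatedKey" not in response:
        break
    scan_kwargs["ExclusiveStartKey"] = response["LastEvaluatedKey"]

print(f"Items matching the date: {len(matches)}")
```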
Hopefully, these images show both the DynamoDB and the AppSync consoles.


Any ideas?
Thanks
Kevin