Questions tagged with Amazon DynamoDB


Browse through the questions and answers listed below or filter and sort to narrow down your results.

Hi, I am working with an A9G module in an IoT project, trying to send data to AWS DynamoDB via an HTTPS URL. The device is a GPS tracker: an ESP8266 drives an Ai-Thinker A9G module over serial using AT commands. Everything starts off smoothly, but HTTPS requests begin to fail after 7 or 8 successful sends. Plain HTTP requests keep working, though. Why do HTTP requests perform fine while HTTPS requests fail after the first 7 or 8, and how can I solve this issue?
0 answers · 0 votes · 7 views · asked 3 hours ago
Hi AWS experts, I have a use case where I am trying to insert a large XML string as an attribute value on an item. I see the value getting truncated to 32,604 characters after saving. I know the maximum item size is 400 KB, which covers the sum of all attribute names and attribute values, and that the same item projected into an LSI of that DynamoDB table counts toward it as well. My question: what is the expected behavior when inserting a large string into an item? Is my assumption correct?
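In case it helps to make the numbers concrete, here is the workaround I am considering: gzip the XML into a Binary attribute and measure the payload before writing. This is only a sketch — the table name `TABLE` and key `pk` are placeholders — and my understanding is that DynamoDB itself either stores the whole item or rejects the write, rather than silently truncating:

```
// Minimal sketch: measure the payload before writing, and gzip large XML
// into a Binary attribute so it stays well under the 400 KB item limit.
// Table name and key are placeholders, not from my real schema.
const AWS = require('aws-sdk');
const zlib = require('zlib');

const docClient = new AWS.DynamoDB.DocumentClient();

async function putXml(pk, xmlString) {
  // Rough size check: attribute names + values count toward the 400 KB cap.
  const approxBytes = Buffer.byteLength(xmlString, 'utf8');
  console.log(`XML payload is ~${approxBytes} bytes before compression`);

  // Store the XML compressed; DocumentClient maps a Buffer to a Binary (B) attribute.
  const compressed = zlib.gzipSync(Buffer.from(xmlString, 'utf8'));

  await docClient.put({
    TableName: 'TABLE',
    Item: { pk, xmlGz: compressed },
  }).promise();
}

// Reading it back: gunzip the Binary attribute to recover the original string.
async function getXml(pk) {
  const { Item } = await docClient.get({ TableName: 'TABLE', Key: { pk } }).promise();
  return zlib.gunzipSync(Item.xmlGz).toString('utf8');
}
```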
Accepted Answer
2 answers · 0 votes · 20 views · asked 3 days ago
I am trying to invoke a Lambda function that stores data in a DynamoDB table. It works in my own AWS account, but not in the company AWS account I'm working in. CloudWatch does not show any errors. The timeout occurs at `await dynamodb.describeTable(describeParams).promise();`. ![Calling the invoke API action failed with this message: Network Failure timeout](/media/postImages/original/IMvwBjv-QCSNWBn1sD7PSBiQ) My code is as follows:

```
const AWS = require('aws-sdk');

const docClient = new AWS.DynamoDB.DocumentClient();
const dynamodb = new AWS.DynamoDB();

exports.handler = async (event) => {
  const valueToStore = event.body || 'default_value';

  const params = {
    TableName: 'my-values',
    Item: {
      id: new Date().toISOString(),
      SessionConfig: valueToStore
    }
  };

  try {
    // This is the call that times out in the company account.
    const describeParams = { TableName: 'my-values' };
    await dynamodb.describeTable(describeParams).promise();
  } catch (error) {
    return {
      statusCode: 500,
      body: JSON.stringify({ message: 'Error while accessing table' })
    };
  }

  try {
    await docClient.put(params).promise();
  } catch (error) {
    return {
      statusCode: 500,
      body: JSON.stringify({ message: 'Error while storing value' })
    };
  }

  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'Value stored successfully' })
  };
};
```
3 answers · 0 votes · 23 views · combii · asked 4 days ago
How do I connect an EventBridge bus directly to an EventBridge Pipe as a source? So: EventBridge Bus -> EventBridge Pipe -> Enrichment (Lambda) -> Pipe Target Event Pattern -> Target (Lambda). As far as I can tell from the documentation and console, I can only select streaming services as Pipe sources. Is this a permanent limitation?

The scenario I want to implement is enriching my EventBridge bus events with feature-flag detail, pre-populated based on identity and detail-type, to discourage target services from making tightly coupled call(s) to the feature-flag service. EventBridge Pipes sounded like the best fit because no code would have to be written to plumb messages along the pipeline, just the Lambda code to enrich messages.

One possible workaround I was planning to try was to set up the pipeline as: EventBridge Bus -> Rule Event Pattern (*) -> Lambda Target (enriches events based on data from a DynamoDB table, with a cache) plus code to push events to a second EventBridge Bus -> Rule Event Pattern(s) -> Target(s); a rough sketch of that enrichment Lambda is below. Would love expert suggestions for alternatives, or word that this is a planned feature change. Thanks
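For reference, a minimal sketch of that enrichment Lambda, assuming the v2 Node.js SDK, a `feature-flags` DynamoDB table, and a second bus named `enriched-bus` (all names are mine, purely illustrative — including the `detail.identity` field):

```
// Enrichment Lambda for the workaround pipeline: triggered by a catch-all
// rule on the first bus, looks up feature flags, and forwards the enriched
// event to a second bus. Bus, table, and field names are placeholders.
const AWS = require('aws-sdk');

const eventBridge = new AWS.EventBridge();
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
  // Look up feature flags keyed by the event's identity + detail-type.
  const { Item } = await docClient.get({
    TableName: 'feature-flags',
    Key: { id: `${event.detail.identity}#${event['detail-type']}` }
  }).promise();

  // Re-publish the original detail plus the flags onto the second bus.
  await eventBridge.putEvents({
    Entries: [{
      EventBusName: 'enriched-bus',
      Source: event.source,
      DetailType: event['detail-type'],
      Detail: JSON.stringify({
        ...event.detail,
        featureFlags: (Item && Item.flags) || {}
      })
    }]
  }).promise();
};
```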
1 answer · 0 votes · 16 views · asked 5 days ago
I have a problem where I have a Building, and the building can be in multiple projects. I need to query all the buildings that belong to a specific project. I tried something like this, but `contains` cannot be used in a key condition to search the flags:

PK => company#BUILDINGS
SK => buildingId#INFO
LSI => flags

I know that:

* I can do it by duplicating the building record, adding the project to the range key (see the sketch after this list).
* A Local Secondary Index per project cannot solve this, as the number of projects is greater than 5.
* A Global Secondary Index could solve this, but having many global secondary indexes (one per project) would increase write costs for each building update. The cost is not that big of an issue, but there is still a hard limit of 20 GSIs — more than I need at the moment, but I wouldn't like to have that kind of hard limit and would prefer to duplicate the records instead.

This could easily (from the user's query perspective) be solved with:

* a numeric range key and bit-wise operations to search for the project;
* a string range key and **contains** instead of only **begins_with**;
* a string range key with a pattern;
* something like Range Value = "01100" and a search for begins_with "??1".

I'm new to DynamoDB and am migrating a MongoDB project to it, so I may be missing something. Is there an easy way to solve this query problem that I'm unaware of? Thanks in advance
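To be explicit about the duplication option, this is the sketch I had in mind: one item per (project, building) pair, so fetching all buildings in a project becomes a single Query. Table and attribute names are illustrative, not my real schema:

```
// Adjacency-list style duplication: write one item per project a building
// belongs to, keyed so that Query PK = "project#<id>" returns its buildings.
// Table and attribute names are placeholders.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

async function addBuilding(buildingId, projectIds, info) {
  // batchWrite caps at 25 items per call; fine for a handful of projects.
  await docClient.batchWrite({
    RequestItems: {
      'my-table': [
        // Canonical record.
        { PutRequest: { Item: { PK: 'company#BUILDINGS', SK: `${buildingId}#INFO`, ...info } } },
        // One duplicate per project membership.
        ...projectIds.map(projectId => ({
          PutRequest: { Item: { PK: `project#${projectId}`, SK: buildingId, ...info } }
        }))
      ]
    }
  }).promise();
}

// All buildings in a project is now a single Query, no contains() needed.
async function buildingsInProject(projectId) {
  const { Items } = await docClient.query({
    TableName: 'my-table',
    KeyConditionExpression: 'PK = :pk',
    ExpressionAttributeValues: { ':pk': `project#${projectId}` }
  }).promise();
  return Items;
}
```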
1 answer · 0 votes · 24 views · Fazel · asked 6 days ago
Hello, I am working on the "Build a serverless web application using Amplify, Cognito, API Gateway, Lambda & DynamoDB" use case: https://aws.amazon.com/getting-started/hands-on/build-serverless-web-app-lambda-apigateway-s3-dynamodb-cognito/module-1/ I have created a repository named "wildrydes-site" and am now trying to populate the git repository; I have also created the bucket and object. Per the use case, which HTML files do I upload to get the same website shown in the use case? Can somebody tell me where to get those website files, and after uploading them, do I have to run these commands:

a. Change directory into your repository and copy the static files from S3:

```
cd wildrydes-site
aws s3 cp s3://wildrydes-us-east-1/WebApplication/1_StaticWebHosting/website ./ --recursive
```

b. Commit the files to your Git service:

```
git add .
git commit -m 'new'
git push
```

Please suggest. Thanks,
1 answer · 0 votes · 62 views · Monica · asked 7 days ago
Hello, I would like to know how the TTL column works when working with item collections. For example, if I wanted to add a TTL column to this schema, I'm not sure how to share the same value across all the records related to the same PK. ![Enter image description here](/media/postImages/original/IMaCFpa3f-Qou8eG0S0R4Lzg) Thanks.
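My current understanding, which I'd like to confirm, is that TTL is just a per-item numeric attribute holding an epoch-seconds timestamp — there is no collection-level TTL — so sharing one expiry across a collection means writing the same value onto every item with that PK. A minimal sketch of what I mean (table and attribute names are placeholders):

```
// TTL in DynamoDB is a plain per-item attribute (Unix epoch seconds),
// so every item in the collection needs the same value written explicitly.
// Table and attribute names here are illustrative.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

async function setCollectionTtl(pk, ttlEpochSeconds) {
  // Find every item in the collection...
  const { Items } = await docClient.query({
    TableName: 'my-table',
    KeyConditionExpression: 'PK = :pk',
    ExpressionAttributeValues: { ':pk': pk }
  }).promise();

  // ...and stamp the same expiry on each one.
  await Promise.all(Items.map(item =>
    docClient.update({
      TableName: 'my-table',
      Key: { PK: item.PK, SK: item.SK },
      UpdateExpression: 'SET #ttl = :ttl',
      ExpressionAttributeNames: { '#ttl': 'ttl' },
      ExpressionAttributeValues: { ':ttl': ttlEpochSeconds }
    }).promise()
  ));
}

// e.g. expire the whole collection 30 days from now:
// setCollectionTtl('order#123', Math.floor(Date.now() / 1000) + 30 * 24 * 3600);
```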
1 answer · 0 votes · 11 views · asked 8 days ago
Sorry, I'm not a dev or coder by any means, but I am trying to "professionalize" the application I developed using no code. I was able to recreate my schema within Amplify Studio, but I'm struggling with a basic coding concept. In Airtable and in Access (what I'm used to), you can make fields that are essentially queries of the DB: each time data from the table is in view, the DB runs the query and populates that field with the most recent data. In Airtable they're called calculated fields. Building my schema in Amplify Studio, I found there were no "calculated fields" and no visual way to build queries and integrate them into the system. So my question is this: **if I wanted a persistent query for a table, or to display it visually in my app, where would I input it during development?** E.g., the number of comments for a post, given a relationship of many comments to one post.

* Is it connected to the UI by React Native code in Amplify Studio?
* Is it a function in AWS Lambda (Node.js) that is constantly triggered and then updates DynamoDB (roughly like the sketch after this list)?
* Is it DataStore code filled out in the AWS CLI?
* Is there a visual editor within Amplify Studio for "calculated fields" and functions?

Please walk me through it.
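To make the Lambda option concrete, here is roughly what I imagine (I may be off base): a function attached to the Comment table's DynamoDB Stream that keeps a `commentCount` attribute current on the parent Post. Table and field names are guesses, not from my actual schema:

```
// One way to emulate a "calculated field": a Lambda wired to the Comment
// table's DynamoDB Stream that maintains a commentCount on the Post item.
// Assumes the stream is configured with NEW_AND_OLD_IMAGES; names are hypothetical.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
  for (const record of event.Records) {
    // Only count comment inserts and deletes.
    const delta = record.eventName === 'INSERT' ? 1
                : record.eventName === 'REMOVE' ? -1
                : 0;
    if (delta === 0) continue;

    const image = record.dynamodb.NewImage || record.dynamodb.OldImage;
    const postId = image.postID.S; // stream images arrive in DynamoDB JSON

    // Atomically bump the counter on the parent Post.
    await docClient.update({
      TableName: 'Post',
      Key: { id: postId },
      UpdateExpression: 'ADD commentCount :d',
      ExpressionAttributeValues: { ':d': delta }
    }).promise();
  }
};
```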
0 answers · 0 votes · 10 views · asked 9 days ago
Hi, I'm encountering an issue with the AWS Amplify DataStore `save` function when updating an object that includes an array property. When I use `DataStore.save` to update such an object, and the update input includes a new array value, the array property is not updated correctly: the new value is merged with the existing values.

Example: Suppose I have an object with an array property called "timeToArrive" whose value is [2, 4]. I use `DataStore.save` to update the object with the new value [1, 1] for the "timeToArrive" property:

```
const original = await DataStore.query(MyModel, id);
await DataStore.save(MyModel.copyOf(original, updated => {
  updated.timeToArrive = [1, 1];
}));
```

Instead of setting the "timeToArrive" property to [1, 1], the save merges the new value with the existing values, resulting in [2, 4, 1, 1].

Attempts to fix: I've tried different approaches to update the array property, including copying the new values using a for loop or the spread operator, and using a custom resolver to update the object. None of these resolved the issue.

Expected behavior: I expect `DataStore.save` to set the array property to the new value without merging it with any existing values.

Current setup:
AWS Amplify version: 5.0.18
AWS services used: DataStore, DynamoDB
Schema definition for the model:

```
type MyModel @model {
  id: ID!
  timeToArrive: [Int]
}
```

Any help or suggestions on how to resolve this issue would be greatly appreciated.
0 answers · 0 votes · 13 views · asked 9 days ago
Hi, I'm having an issue with incorrect results returned from a DynamoDB scan, both in the console and programmatically. The first image shows the correct result for a query (5 rows returned): https://www.dropbox.com/s/7x6s9q89q7gd5go/dunamodb1.png?dl=0 The second image shows the same results from a scan, either in the console or through Node.js code (4 rows returned): https://www.dropbox.com/s/kpk3wus2vlf1ywz/dynamodb2.png?dl=0 There are actually 840 rows in this table if I just page through 50 rows at a time in the table details view, but a scan only returns 813 rows. Thx
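For reference, this is a minimal sketch of the paginated scan I'm using to count rows (table name changed); my understanding is that a single Scan call returns at most 1 MB of data and has to be continued via `LastEvaluatedKey`:

```
// Count all rows with a fully paginated scan: a single Scan call stops
// after ~1 MB, so the loop must continue from LastEvaluatedKey.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

async function countRows(tableName) {
  let count = 0;
  let lastKey; // undefined on the first page

  do {
    const page = await docClient.scan({
      TableName: tableName,
      ExclusiveStartKey: lastKey,
      Select: 'COUNT' // we only need the count, not the items
    }).promise();

    count += page.Count;
    lastKey = page.LastEvaluatedKey; // undefined once the scan is complete
  } while (lastKey);

  return count;
}

// countRows('my-table').then(n => console.log(`scanned ${n} rows`));
```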
3 answers · 0 votes · 33 views · leehu · asked 11 days ago
Hi, we are unable to use DynamoDB's update API because our records are encrypted and signed. To maintain a valid encryption signature, our service has to first GET the record from DynamoDB, update it, then PUT it back. We are using client-side encryption. From the [AWS docs:](https://docs.aws.amazon.com/dynamodb-encryption-client/latest/devguide/java-examples.html)

> Because your DynamoDB Mapper is configured to use the PUT save behavior, the item replaces any item with the same primary keys, instead of updating it. This ensures that the signatures match and you can decrypt the item when you get it from the table.

This opens our application up to race conditions, i.e., the record could be updated by another process between the GET and the PUT. We have looked into other solutions, such as using a [conditional expression or version attribute](https://docs.amazonaws.cn/en_us/amazondynamodb/latest/developerguide/DynamoDBMapper.OptimisticLocking.html) that would throw a [ConditionalCheckFailedException](https://sdk.amazonaws.com/java/api/2.0.0/software/amazon/awssdk/services/dynamodb/model/ConditionalCheckFailedException.html) if the record has been modified by another process, and retrying (sketched below). There are disadvantages to this — for example, if many processes access the same record within a short window, there could be a lot of retries and an overall latency increase. Is there some way to use DynamoDB's update API on a record with encrypted and signed attributes?
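To make the conditional-expression approach we evaluated concrete, here is the pattern as a minimal Node.js sketch (our real code uses the Java DynamoDBMapper; the `encryptItem`/`decryptItem` helpers below are pass-through stubs standing in for the encryption client, and all names are placeholders — note the `version` attribute would need to be stored signed-but-not-encrypted for the condition to evaluate):

```
// Optimistic locking around the GET -> modify -> PUT cycle that client-side
// encryption forces on us. Stubs below stand in for the encryption client.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

// Pass-through stand-ins for the real encrypt/sign and decrypt/verify steps.
const encryptItem = async (item) => item;
const decryptItem = async (item) => item;

async function updateWithRetry(key, mutate, maxAttempts = 5) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    // GET and decrypt the current record.
    const { Item } = await docClient.get({ TableName: 'records', Key: key }).promise();
    const plain = await decryptItem(Item);

    // Apply the business change, bump the version, re-encrypt and sign.
    const updated = mutate(plain);
    updated.version = (plain.version || 0) + 1;
    const encrypted = await encryptItem(updated);

    try {
      // PUT only if nobody else wrote in between.
      await docClient.put({
        TableName: 'records',
        Item: encrypted,
        ConditionExpression: 'attribute_not_exists(#v) OR #v = :expected',
        ExpressionAttributeNames: { '#v': 'version' },
        ExpressionAttributeValues: { ':expected': plain.version || 0 }
      }).promise();
      return updated;
    } catch (err) {
      if (err.code !== 'ConditionalCheckFailedException') throw err;
      // Lost the race: another writer bumped the version; retry from the GET.
    }
  }
  throw new Error('update failed after max optimistic-lock retries');
}
```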
1 answer · 0 votes · 20 views · asked 12 days ago
Hi, we are trying to set up an **SCP** that denies some **DynamoDB** actions based on the **IP ranges** of our network, so that IAM users, for example, can't Scan or Query a DynamoDB table from outside our network. In this SCP we need to add **an exception** for some AWS services (like **EC2** or **Lambda**) so they can freely Query/Scan a DynamoDB table if they have the necessary permissions. We tried the following SCP, and it worked fine for the first case (IAM users) but failed for the Lambda case: we still receive an AccessDenied error when trying to Query a DynamoDB table from a Lambda function:

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Action": "dynamodb:*",
            "Resource": "*",
            "Condition": {
                "Bool": {
                    "aws:ViaAWSService": "false"
                },
                "NotIpAddress": {
                    "aws:SourceIp": [
                        "IP Range"
                    ]
                }
            }
        }
    ]
}
```

Do you know how we can add this exception for all AWS services that need to perform DynamoDB actions, without having to list the ARN of each specific IAM role used by these services?
2 answers · 0 votes · 67 views · asked 14 days ago