Questions tagged with Amazon DynamoDB
I am adding a new library that depends on the AWS SDK to my project. However, an otherwise untouched existing component that queries DynamoDB now throws this exception:
```
Caused by: com.amazonaws.services.dynamodbv2.model.AmazonDynamoDBException: User not found: the_user_key (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: UnrecognizedClientException; Request ID: null; Proxy: null)
```
`the_user_key` worked fine before I added the new library, but now it suddenly reports that the user was not found.
I am using `com.amazonaws:aws-java-sdk-dynamodb:1.11.704` and `com.amazonaws:aws-java-sdk-core:1.11.704`, and there are also a number of `software.amazon.awssdk` dependencies at version 2.16.87.
I am wondering if anybody has encountered similar issues before, or could offer any advice.
Hi there,
we are implementing a job portal based on a serverless architecture in AWS. All the data should be saved in one single DynamoDB table. Some of the entities are User (Candidate or Employer), Job, and Application. The biggest challenge we are facing right now is saving the jobs in such a way that a job search (by candidates) is as accurate as possible. The access pattern should look like this:
*Find all Jobs containing this hard skills With this EmploymentType With this experienceLevel for this Location*
On the other hand, the access pattern from the employer's perspective is much simpler: find all jobs of my company.
We would appreciate any hint or idea on how to model this in DynamoDB.
Thanks in advance.
Regards from Germany
PS: The more we think about it, the more we have the feeling that the job search from the candidate perspective in particular needs a solution like Amazon CloudSearch or Elasticsearch. But before we make this decision, we wanted to get some more opinions.
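For what it's worth, here is a minimal sketch of how the simpler employer-side pattern ("find all jobs of my company") might be keyed in a single table. The key shapes and names (`COMPANY#<id>`, `JOB#<id>`, the table name) are assumptions for illustration, not a full design:

```javascript
// Hypothetical single-table key helpers for the Job entity.
// Key shapes (COMPANY#<id> / JOB#<id>) are assumptions, not a prescribed design.
function jobItemKeys(companyId, jobId) {
  return { PK: `COMPANY#${companyId}`, SK: `JOB#${jobId}` };
}

// Query params for "find all jobs of my company" (DocumentClient-style).
function jobsOfCompanyQuery(tableName, companyId) {
  return {
    TableName: tableName,
    KeyConditionExpression: 'PK = :pk AND begins_with(SK, :sk)',
    ExpressionAttributeValues: { ':pk': `COMPANY#${companyId}`, ':sk': 'JOB#' }
  };
}

// Usage (assumed): await docClient.query(jobsOfCompanyQuery('jobs-table', 'acme')).promise();
```

The multi-facet candidate search (skills + employment type + experience level + location) does not map cleanly onto key conditions like this, which supports the instinct above to offload it to a dedicated search service.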
In our lessons, we were told to test employee-directory-app-exercise6, but when I added the photos and attributes, it confirmed that it saved, yet nothing was added to the employee app or the DynamoDB table. I tried three times with the same issue. Please help me get past this phase.
Hello,
I'm trying to set up DAX to handle caching for our DynamoDB logic in our existing Kubernetes cluster.
However, when I follow the guides, they are incomplete.
From the official doc here:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DAX.create-cluster.console.create-subnet-group.html
1. Open the **DynamoDB** console at https://console.aws.amazon.com/dynamodb/.
2. In the navigation pane, under **DAX**, choose **Subnet groups**.
However, there is NO such thing as "DAX" under DynamoDB. There are simply options like Create table, etc. When I search for DAX in the console, I get no hits.
How exactly am I supposed to understand how this is to be done when the official guide itself isn't correct?
Same with the guides I've found; they simply do not align with how it looks in real life.
Help would be much appreciated, since our prod environment is in dire need of this ASAP.
Kind regards
Olle
Hi, I am working on the A9G module in IoT, where I am trying to send data to AWS DynamoDB via an HTTPS URL.
As part of a GPS tracker device, I'm using AT commands to communicate with an Ai-Thinker A9G module from an ESP8266. Things start off smoothly, but HTTPS requests begin to fail after 7 or 8 are sent successfully. Plain HTTP requests still work fine, though.
What could be the reason that HTTP requests work fine while HTTPS requests fail after the first 7 or 8, and how can I solve this issue?
Hi AWS experts,
I have a use case where I am trying to insert a large XML string as an attribute value of an item. I see that the value is truncated to 32,604 characters after saving. I know that an item's maximum size is 400 KB, which includes the sum of all attribute names and attribute values, and that this also covers the same item projected into an LSI of that DynamoDB table. My question: is this truncation expected when inserting a large string into an item? Is my assumption correct?
I am trying to invoke a Lambda to store data in a DynamoDB table. In my own AWS account it works, but not in the company AWS account I'm working in. CloudWatch does not show any errors. The timeout occurs at `await dynamodb.describeTable(describeParams).promise();`.

My code is as follows:
```
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();
const dynamodb = new AWS.DynamoDB();

exports.handler = async (event) => {
  const valueToStore = event.body || 'default_value';
  const params = {
    TableName: 'my-values',
    Item: {
      id: new Date().toISOString(),
      SessionConfig: valueToStore
    }
  };

  try {
    const describeParams = { TableName: 'my-values' };
    await dynamodb.describeTable(describeParams).promise();
  } catch (error) {
    return {
      statusCode: 500,
      body: JSON.stringify({ message: 'Error while accessing table' })
    };
  }

  try {
    await docClient.put(params).promise();
  } catch (error) {
    return {
      statusCode: 500,
      body: JSON.stringify({ message: 'Error while storing value' })
    };
  }

  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'Value stored successfully' })
  };
};
```
How do I connect an EventBridge Bus directly to EventBridge Pipes as a source? So: EventBridge Bus -> EventBridge Pipes -> Enrichment (Lambda) -> Pipes target event pattern -> Target (Lambda). As far as I can tell from the documentation and console, I can only select streaming services as Pipes sources. Is this a limitation that is fixed forever?
The scenario I wanted to implement was enriching my EventBridge Bus events with feature-flag detail pre-populated based on identity and detail-type, to discourage target services from making tightly coupled calls to the feature flag service. I thought EventBridge Pipes sounded like the best idea, since no code would have to be written to plumb messages along the "pipeline", just the Lambda code to enrich messages.
One possible workaround I was planning to try was to set up my pipeline as: EventBridge Bus -> Rule event pattern (*) -> Lambda target (enriches events based on data from a DynamoDB table with a cache), plus code to push the enriched events to a second EventBridge Bus -> Rule event pattern(s) -> Target(s).
Would love expert suggestions for alternatives, or to hear that this is a planned feature change.
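A minimal sketch of the enrichment step in that workaround, with all names (the flag table, the second bus, the flag attribute) being placeholders: a pure helper builds the PutEvents entry, and the commented-out lines show where the AWS calls would go.

```javascript
// Build the PutEvents entry for the second bus from the original event
// plus looked-up feature flags. All names here are illustrative placeholders.
function buildEnrichedEntry(event, flags, targetBusName) {
  return {
    EventBusName: targetBusName,
    Source: event.source,
    DetailType: event['detail-type'],
    Detail: JSON.stringify({ ...event.detail, featureFlags: flags })
  };
}

// Lambda handler wiring (assumed; aws-sdk is available in the Lambda runtime):
// const AWS = require('aws-sdk');
// const docClient = new AWS.DynamoDB.DocumentClient();
// const eventBridge = new AWS.EventBridge();
// exports.handler = async (event) => {
//   const { Item } = await docClient.get({
//     TableName: 'feature-flags',            // placeholder table
//     Key: { identity: event.source }        // assumed key schema
//   }).promise();
//   await eventBridge.putEvents({
//     Entries: [buildEnrichedEntry(event, Item ? Item.flags : {}, 'enriched-bus')]
//   }).promise();
// };
```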
Thanks
I have a problem where I have a Building entity, and a building can belong to multiple projects. I need to query all the buildings that belong to a specific project.
I tried to do something like this, but `contains` cannot be used in a key condition to search the flags:
```
PK  => company#BUILDINGS
SK  => buildingId#INFO
LSI => flags
```
I know that:
* I can do it by duplicating the building record, adding the project to the range key.
* A Local Secondary Index per project cannot solve this, as the number of projects is greater than 5.
* A Global Secondary Index can solve this, but having many global secondary indexes (one per project) would increase write cost for each building update. The cost is not that big of an issue, but there is still a hard limit of 20 GSIs per table, more than needed at the moment; I wouldn't like to depend on that kind of hard limit and would prefer to duplicate the records instead.
This could easily be solved (from a query perspective) using:
* a numeric range key and bit-wise operations to search for the project;
* a string range key with **contains** instead of only **begins_with**;
* a string range key with a pattern, e.g. a range value of "01100" searched with something like begins_with "??1".
I'm new to DynamoDB and I'm migrating a MongoDB project to it, so I may be missing something.
Is there an easy way to solve this query problem that I'm unaware of?
thanks in advance
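To illustrate the record-duplication option from the list above (which keeps reads as plain `begins_with` queries), here is a rough sketch; the key shapes are assumptions built on the layout in the question:

```javascript
// Duplicate each building record once per project so that
// "all buildings in project X" becomes a begins_with key condition.
function buildingProjectItems(buildingId, projects, info) {
  return projects.map((projectId) => ({
    PK: 'company#BUILDINGS',
    SK: `project#${projectId}#building#${buildingId}`,
    ...info
  }));
}

// Query params for "all buildings of one project" (DocumentClient-style).
function buildingsOfProjectQuery(tableName, projectId) {
  return {
    TableName: tableName,
    KeyConditionExpression: 'PK = :pk AND begins_with(SK, :sk)',
    ExpressionAttributeValues: {
      ':pk': 'company#BUILDINGS',
      ':sk': `project#${projectId}#`
    }
  };
}
```

The trade-off is that every building update fans out into one write per project the building belongs to, which matches the cost concern raised about GSIs but without the 20-index hard limit.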
Hello,
I am working on the "Build a serverless web application using Amplify, Cognito, API Gateway, Lambda & DynamoDB" use case:
https://aws.amazon.com/getting-started/hands-on/build-serverless-web-app-lambda-apigateway-s3-dynamodb-cognito/module-1/
My question: I have created a repository named "wildrydes-site". Now I am trying to perform the "Populate the git repository" step; I have created the bucket and object. As per the use case, which HTML file do I have to upload to get the same website shown in the use case? Can somebody tell me where to get the same website files to upload? Also, after uploading the files, do I have to run these commands:
a. Change directory into your repository and copy the static files from S3:
```
cd wildrydes-site
aws s3 cp s3://wildrydes-us-east-1/WebApplication/1_StaticWebHosting/website ./ --recursive
```
b. Commit the files to your Git service:
```
git add .
git commit -m 'new'
git push
```
Please suggest.
Thanks,
hello, I would like to know how the TTL column works when we are working with item collections. For example, if I wanted to add a TTL attribute to this schema, I'm not sure how to share the same value across all the records related to the same PK.

Thanks.
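For context on what sharing a TTL across an item collection might look like: TTL in DynamoDB is just a per-item Number attribute holding an epoch-seconds timestamp, so the usual approach is to compute one expiry value and write it onto every item with the same PK. A sketch under those assumptions (the attribute name `ttl` is whatever you configure on the table):

```javascript
// TTL in DynamoDB is a plain Number attribute holding epoch seconds;
// every item that should expire needs its own copy of the value.
function ttlEpoch(daysFromNow, now = Date.now()) {
  return Math.floor(now / 1000) + daysFromNow * 24 * 60 * 60;
}

// Stamp the same expiry onto every item of one collection (same PK).
function withSharedTtl(items, expiresAt) {
  return items.map((item) => ({ ...item, ttl: expiresAt }));
}

// Usage (assumed): compute once, then write the stamped items together,
// e.g. via docClient.batchWrite or transactWrite:
// const expiresAt = ttlEpoch(30);
// const stamped = withSharedTtl(collectionItems, expiresAt);
```

Note that TTL deletion happens per item, typically within a day or two of expiry, so items of one collection may disappear at slightly different times even with an identical value.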
Sorry, I'm not a dev or coder by any means, but I am trying to "professionalize" the application I developed using no-code tools. I was able to recreate my schema within Amplify Studio, but I'm struggling with a basic coding concept:
In Airtable and in Access (what I'm used to), you can make fields that are essentially queries of a DB: whenever data from the table is in view, the DB will query and populate that field to represent the most recent data. In Airtable, they're called calculated fields. In developing the Amplify Studio schema, I found there were no "calculated fields" and no visual way to build queries and integrate them into the system.
So my question is this: **if I wanted a persistent query for a table, or to display one visually in my app, where would I input it during my development?** e.g. the number of comments for a post, given a relationship of many Comments to one Post.
* is it connected to the UI by React-native code in Amplify Studio?
* Is it a function in AWS Lambda (Node.js) that is constantly triggered and then updates DynamoDB?
* Is it DataStore code written via the AWS CLI?
* Is there a visual editor within Amplify Studio for "Calculated Fields" and Functions?
Please walk me through it.
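One common serverless stand-in for a calculated field like "number of comments per post" is to keep a counter attribute on the Post item and have a Lambda on DynamoDB Streams adjust it whenever a Comment is inserted or removed. A sketch under those assumptions (the table name, key schema, and `commentCount` attribute are all made up): the pure helper builds the UpdateItem params, and the trigger wiring is shown in comments.

```javascript
// Build UpdateItem params that atomically adjust a commentCount
// attribute on a post. Table/attribute names are illustrative.
function commentCountUpdate(tableName, postId, delta) {
  return {
    TableName: tableName,
    Key: { id: postId },
    UpdateExpression: 'ADD commentCount :d',
    ExpressionAttributeValues: { ':d': delta }
  };
}

// Streams-triggered Lambda wiring (assumed; runs in the Lambda runtime):
// const AWS = require('aws-sdk');
// const docClient = new AWS.DynamoDB.DocumentClient();
// exports.handler = async (event) => {
//   for (const record of event.Records) {
//     const delta = record.eventName === 'INSERT' ? 1
//                 : record.eventName === 'REMOVE' ? -1 : 0;
//     if (delta !== 0) {
//       // Assumes the Comment item's keys include its postId.
//       const keys = record.dynamodb.Keys;
//       await docClient.update(commentCountUpdate('Post', keys.postId.S, delta)).promise();
//     }
//   }
// };
```

The UI then just reads `commentCount` from the Post record like any other field, so no constantly-running query is needed; the counter is updated only when comments change.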