DynamoDB CSV import task failed, but the table was created successfully with the correct amount of data imported

0

Very weird situation: I was using the "Import from S3" function in the DynamoDB console to import a CSV file with 300 rows of data from an S3 bucket. The import status was Failed with the error "Some of the items failed validation checks and were not imported." When I checked the error log in CloudWatch, it showed "Unable to map value based on the given schema. Remainder of the file will not be processed."

However, the output DynamoDB table was created successfully, and the amount of data imported to the table was correct (300 items). Everything looks right except that the import task failed and I received error messages. Could anyone suggest what might be wrong here?

Chen
asked a year ago · 2168 views
2 Answers
2
Accepted Answer

It looks like you have a 0-byte object in S3, which you believe to be your folder/directory. S3 does not natively support folders, but if you use the S3 console to create a folder, it creates a 0-byte S3 object.

When you create a folder in Amazon S3, S3 creates a 0-byte object with a key that's set to the folder name that you provided. For example, if you create a folder named photos in your bucket, the Amazon S3 console creates a 0-byte object with the key photos/. The console creates this object to support the idea of folders.

The Amazon S3 console treats all objects that have a forward slash (/) character as the last (trailing) character in the key name as a folder (for example, examplekeyname/). You can't upload an object that has a key name with a trailing / character by using the Amazon S3 console. However, you can upload objects that are named with a trailing / with the Amazon S3 API by using the AWS CLI, AWS SDKs, or REST API.

An object that is named with a trailing / appears as a folder in the Amazon S3 console. The Amazon S3 console does not display the content and metadata for such an object. When you use the console to copy an object named with a trailing /, a new folder is created in the destination location, but the object's data and metadata are not copied.

By design, the import from S3 feature scans for all S3 objects under a given prefix and attempts to read them. Since <folder name>/ is an S3 object, we just treat it as data.
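If you want to confirm whether such a zero-byte folder marker is sitting under your import prefix, a minimal sketch like the one below (assuming boto3 is available; the bucket and prefix names are placeholders) lists every object the import would scan and flags the empty ones:

import boto3

# Placeholder names -- replace with your own bucket and import prefix.
BUCKET = "my-import-bucket"
PREFIX = "imports/orders/"

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# The import scans every object under the prefix, so list them the same way
# and flag zero-byte objects (typically console-created folder markers ending in "/").
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        note = "  <-- zero-byte folder marker?" if obj["Size"] == 0 else ""
        print(obj["Key"], obj["Size"], "bytes", note)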

To overcome this issue, avoid creating folders in the AWS web console, and instead upload your data directly with the CLI or an SDK:

aws s3 cp <file> s3://bucket/prefix/to/file.csv
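If the folder marker already exists in the bucket, another option (again just a sketch, using the same placeholder names as above) is to delete that single zero-byte object before starting the import:

import boto3

# Placeholder names -- adjust to your bucket and the exact folder-marker key.
BUCKET = "my-import-bucket"
FOLDER_MARKER_KEY = "imports/orders/"  # the zero-byte object created by the console

s3 = boto3.client("s3")
# Deleting only this exact key removes the console-created folder marker
# while leaving the data files stored under the prefix untouched.
s3.delete_object(Bucket=BUCKET, Key=FOLDER_MARKER_KEY)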

AWS
EXPERT
answered a year ago
0

In addition to the answer Leeroy provided above, I've also found that any duplicate entry in the CSV file can cause the task to fail, since the primary key of each item has to be unique in the DynamoDB table. Go to Imports from S3 --> select an import job --> Import details --> check the error count to see the number of items DynamoDB skipped during the import.
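As a quick sanity check before re-running an import, you could scan the CSV for repeated key values yourself. The sketch below is only illustrative: it assumes the table uses only a partition key, and the file name data.csv and key column name "pk" are hypothetical, so adjust both to match your data and key schema.

import csv
from collections import Counter

# Hypothetical file and column names -- change them to match your data.
CSV_FILE = "data.csv"
KEY_COLUMN = "pk"

with open(CSV_FILE, newline="") as f:
    keys = [row[KEY_COLUMN] for row in csv.DictReader(f)]

# Any key value that appears more than once would collide in the DynamoDB table.
duplicates = {k: n for k, n in Counter(keys).items() if n > 1}
if duplicates:
    print("Duplicate key values found:", duplicates)
else:
    print("All key values are unique.")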

Chen
answered a year ago
