
DynamoDB CSV import task failed, but the table was successfully created with the correct amount of data imported


Very weird situation: I was using the "Import from S3" function in the DynamoDB console to import a CSV file with 300 rows of data from an S3 bucket. The import failed with the error "Some of the items failed validation checks and were not imported." When I checked the error log in CloudWatch, it showed "Unable to map value based on the given schema. Remainder of the file will not be processed."

However, the output DynamoDB table was successfully created, and the amount of data imported into the table was correct (300 items). Everything looks right except that the import task failed and I received error messages. Could anyone suggest what might be wrong here?

Asked 3 years ago · 4.7K views
2 Answers
Accepted Answer

This looks like you have a 0-byte object in S3, which you believe to be your folder/directory. S3 does not natively support folders, but if you use the S3 console to create a folder, it creates a 0-byte S3 object.

When you create a folder in Amazon S3, S3 creates a 0-byte object with a key that's set to the folder name that you provided. For example, if you create a folder named photos in your bucket, the Amazon S3 console creates a 0-byte object with the key photos/. The console creates this object to support the idea of folders.

The Amazon S3 console treats all objects that have a forward slash (/) character as the last (trailing) character in the key name as a folder (for example, examplekeyname/). You can't upload an object that has a key name with a trailing / character by using the Amazon S3 console. However, you can upload objects that are named with a trailing / with the Amazon S3 API by using the AWS CLI, AWS SDKs, or REST API.

An object that is named with a trailing / appears as a folder in the Amazon S3 console. The Amazon S3 console does not display the content and metadata for such an object. When you use the console to copy an object named with a trailing /, a new folder is created in the destination location, but the object's data and metadata are not copied.

By design, the import-from-S3 feature scans for all S3 objects under a given prefix and attempts to read them. Since <folder name>/ is itself an S3 object, it is treated as data, and parsing it fails validation.
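The scanning behavior described above can be sketched in a few lines of plain Python. This is only an illustration of why the folder-marker object gets picked up, not the actual import implementation; the key names and sizes below are made up.

```python
# Illustrative sketch: a naive "list everything under a prefix" scan,
# the way the import feature enumerates S3 objects. Keys/sizes are examples.

def objects_under_prefix(listing, prefix):
    """Return every object whose key starts with the prefix."""
    return [obj for obj in listing if obj["Key"].startswith(prefix)]

# A bucket where the "photos/" folder was created via the S3 console:
listing = [
    {"Key": "photos/", "Size": 0},           # 0-byte folder-marker object
    {"Key": "photos/data.csv", "Size": 12345},
]

scanned = objects_under_prefix(listing, "photos/")

# The 0-byte marker is included in the scan; attempting to parse it as
# CSV is what produces the "Unable to map value" validation error.
markers = [o for o in scanned if o["Key"].endswith("/") and o["Size"] == 0]
```

Note that the marker object satisfies the prefix match just like the real data file does, so the scanner has no reason to skip it.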

To overcome this issue, avoid creating folders in the AWS web console; instead, upload your data directly with the CLI or an SDK:

aws s3 cp <file> s3://bucket/prefix/to/file.csv

AWS
EXPERT
Answered 3 years ago
Expert-reviewed 2 years ago

In addition to the answer Leeroy provided above, I've also found that any duplicate entry in the CSV file can fail the task, since each item in a DynamoDB table must have a unique primary key. Go to Imports from S3 --> select an import job --> Import details --> check the error count to see the number of items DynamoDB skipped during the import.
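If duplicates are the suspect, it can help to pre-check the CSV before importing. A minimal sketch below counts repeated primary-key values; the column name "pk" and the sample rows are assumptions for illustration, so substitute your table's actual partition-key column.

```python
# Pre-check a CSV for duplicate primary-key values before a DynamoDB import.
# "pk" is a placeholder column name; use your table's partition key.
import csv
import io
from collections import Counter

def duplicate_keys(csv_text, key_column):
    """Return {key_value: count} for every key appearing more than once."""
    rows = csv.DictReader(io.StringIO(csv_text))
    counts = Counter(row[key_column] for row in rows)
    return {k: n for k, n in counts.items() if n > 1}

# Example data: "user1" appears twice, so one of those rows would be
# skipped (and counted as an error) during the import.
sample = "pk,value\nuser1,a\nuser2,b\nuser1,c\n"
dups = duplicate_keys(sample, "pk")
```

Running this over your real file (reading it with `open()` instead of the inline string) should flag exactly the rows that would show up in the import job's error count.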

Answered 3 years ago
