Moving data from S3 to a DynamoDB table

I have a Glue job that moves data from S3 to DynamoDB. It works fine for smaller data sets, but for larger ones I increase the table's write capacity to 10,000 and the DynamoDB table doesn't seem to recognize the increase: consumed write capacity stays at about 1 instead of 10,000. Is there something I am missing?

ApplyMapping_node2 = ApplyMapping.apply(
    frame=S3bucket_node1,
    mappings=[
        ("hh_key_before.n", "string", "hh_key_before", "long"),
        ("reason.s", "string", "reason", "string"),
        ("change_seq.n", "string", "change_seq", "long"),
        ("key_status_after.s", "string", "key_status_after", "string")
    ],
    transformation_ctx="ApplyMapping_node2",
)

dynamodb_output_options = {
    "dynamodb.region": "us-east-1",
    "dynamodb.output.tableName": dynamodb_table,
    "dynamodb.throughput.write.percent": "1.0"
}

# Write the DataFrame to DynamoDB
glueContext.write_dynamic_frame.from_options(
    frame=ApplyMapping_node2,
    connection_type="dynamodb",
    connection_options=dynamodb_output_options
)

Asked 10 months ago · 230 views
1 Answer

In your code snippet, the dynamodb.throughput.write.percent option is set to "1.0". That value is a ratio, not a percentage: per the AWS Glue documentation it accepts values from 0.1 to 1.5 (the default is 0.5), where 0.5 targets half of the table's provisioned write capacity and 1.0 targets all of it. With 10,000 provisioned WCU, 0.5 would aim for roughly 5,000 writes per second and 1.0 for the full 10,000, so this setting does not explain a consumption rate of 1. Setting it to "100.0" is outside the accepted range and will not increase throughput.

A few other things are worth checking: confirm that the capacity increase had finished applying and the table was ACTIVE before the Glue job started writing, since the connector reads the table's provisioned throughput when it begins the write; confirm the table is in provisioned rather than on-demand mode, where provisioned WCU does not apply; and make sure the job has enough workers to drive that write rate. If you do want Glue to push beyond 100% of the provisioned capacity, the highest accepted value for dynamodb.throughput.write.percent is "1.5".
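
As a quick sanity check before (re)running the job, something along these lines can confirm what DynamoDB actually reports for the table. This is just a sketch; the table name "my-table" is a placeholder for whatever your dynamodb_table variable holds.

import boto3

# Describe the target table and print its billing mode, provisioned write
# capacity, and status, to confirm the capacity increase is in effect
# before the Glue job starts writing.
dynamodb = boto3.client("dynamodb", region_name="us-east-1")

table = dynamodb.describe_table(TableName="my-table")["Table"]  # placeholder table name

billing_mode = table.get("BillingModeSummary", {}).get("BillingMode", "PROVISIONED")
write_capacity = table.get("ProvisionedThroughput", {}).get("WriteCapacityUnits", 0)

print("Billing mode:", billing_mode)          # on-demand tables ignore provisioned WCU
print("Provisioned WCU:", write_capacity)     # should show 10000 after the increase
print("Table status:", table["TableStatus"])  # wait for ACTIVE before starting the job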

Here's the connection-options snippet with that maximum ratio, if you want to try it:

dynamodb_output_options = {
    "dynamodb.region": "us-east-1",
    "dynamodb.output.tableName": dynamodb_table,
    # Ratio of the table's provisioned WCU to target; accepted range is 0.1 to 1.5
    "dynamodb.throughput.write.percent": "1.5"
}

# Write the DataFrame to DynamoDB
glueContext.write_dynamic_frame.from_options(
    frame=ApplyMapping_node2,
    connection_type="dynamodb",
    connection_options=dynamodb_output_options
)

Keep in mind that dynamodb.throughput.write.percent only sets a target rate relative to the table's provisioned write capacity. If consumed capacity still hovers around 1 WCU, the cause is more likely the table's effective capacity or billing mode at the time the job ran, or the job's parallelism, than this option.
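
If it helps, here is a rough sketch of how you could pull the table's ConsumedWriteCapacityUnits metric from CloudWatch after a run to see what write rate the job actually achieved; the table name and the one-hour window are placeholders.

import boto3
from datetime import datetime, timedelta

# Fetch ConsumedWriteCapacityUnits for the table over the last hour and
# convert each 60-second sum into an average WCU-per-second figure.
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/DynamoDB",
    MetricName="ConsumedWriteCapacityUnits",
    Dimensions=[{"Name": "TableName", "Value": "my-table"}],  # placeholder table name
    StartTime=datetime.utcnow() - timedelta(hours=1),         # placeholder window
    EndTime=datetime.utcnow(),
    Period=60,
    Statistics=["Sum"],
)

for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Sum"] / 60)  # average WCU consumed per second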

Answered 10 months ago
