Moving data from S3 to a DynamoDB table


I have a Glue job that moves data from S3 to DynamoDB. It works for smaller jobs, but for larger data sets I increase the table's provisioned write capacity to 10,000, yet the table doesn't seem to pick this up: consumed write capacity stays at about 1 instead of 10,000. Is there something I am missing?
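For context, raising a table's provisioned write capacity is typically done in the console or with an update_table call before the job runs. A rough boto3 sketch of that step is below; the table name and read capacity value are placeholders, and it assumes the table is in provisioned (not on-demand) mode:

import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Placeholder table name and read capacity; assumes PROVISIONED billing mode.
dynamodb.update_table(
    TableName="my_target_table",
    ProvisionedThroughput={
        "ReadCapacityUnits": 100,
        "WriteCapacityUnits": 10000,
    },
)

# The capacity change takes a moment to apply; wait until the table is ACTIVE
# before starting the Glue job.
waiter = dynamodb.get_waiter("table_exists")
waiter.wait(TableName="my_target_table")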

ApplyMapping_node2 = ApplyMapping.apply(
    frame=S3bucket_node1,
    mappings=[
        ("hh_key_before.n", "string", "hh_key_before", "long"),
        ("reason.s", "string", "reason", "string"),
        ("change_seq.n", "string", "change_seq", "long"),
        ("key_status_after.s", "string", "key_status_after", "string"),
    ],
    transformation_ctx="ApplyMapping_node2",
)

dynamodb_output_options = {
    "dynamodb.region": "us-east-1",
    "dynamodb.output.tableName": dynamodb_table,
    "dynamodb.throughput.write.percent": "1.0"
}

# Write the DynamicFrame to DynamoDB
glueContext.write_dynamic_frame.from_options(
    frame=ApplyMapping_node2,
    connection_type="dynamodb",
    connection_options=dynamodb_output_options
)
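The snippet above assumes glueContext and the S3 source frame S3bucket_node1 already exist earlier in the script. A typical setup for that part of a Glue job looks roughly like this; the S3 path and format are placeholders:

import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext.getOrCreate()
glueContext = GlueContext(sc)

# Placeholder S3 path and format; the source frame the mapping reads from.
S3bucket_node1 = glueContext.create_dynamic_frame.from_options(
    connection_type="s3",
    format="json",
    connection_options={"paths": ["s3://my-source-bucket/prefix/"]},
    transformation_ctx="S3bucket_node1",
)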

asked 9 months ago · 219 views
1 Answer

In your code snippet, the dynamodb.throughput.write.percent option is set to "1.0". Note that this option is a fraction of the table's provisioned write capacity, not a percentage point: accepted values range from "0.1" to "1.5", the default is "0.5", and "1.0" means the job targets 100% of the provisioned WCU. In other words, "1.0" already asks for the full 10,000 WCU, and a value like "100.0" falls outside the accepted range.

If consumed capacity still sits at around 1 WCU, this setting is unlikely to be the limiter. It is worth confirming that the table is in provisioned mode, that the increase to 10,000 WCU was in effect before the job started, and that the job has enough data and workers to drive that write rate.

For reference, here is the write configuration with the option left at "1.0" (full provisioned capacity):

dynamodb_output_options = {
    "dynamodb.region": "us-east-1",
    "dynamodb.output.tableName": dynamodb_table,
    # "1.0" = 100% of provisioned WCU; accepted values are "0.1" to "1.5"
    "dynamodb.throughput.write.percent": "1.0"
}

# Write the DynamicFrame to DynamoDB
glueContext.write_dynamic_frame.from_options(
    frame=ApplyMapping_node2,
    connection_type="dynamodb",
    connection_options=dynamodb_output_options
)

With dynamodb.throughput.write.percent at "1.0", the Glue job already targets the full provisioned write capacity of the table, 10,000 WCU in this case. If observed consumption stays far below that, look at the table's billing mode, when the capacity change took effect, and the job's parallelism rather than at this option.
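One quick way to confirm what the job is actually writing against is to describe the table before running it. A rough boto3 sketch; the table name is a placeholder:

import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Placeholder table name; substitute the real target table.
desc = dynamodb.describe_table(TableName="my_target_table")["Table"]

# An on-demand table reports PAY_PER_REQUEST and ignores the write percent
# setting; a provisioned table should show the 10,000 WCU that were configured.
print(desc.get("BillingModeSummary", {}).get("BillingMode", "PROVISIONED"))
print(desc["ProvisionedThroughput"]["WriteCapacityUnits"])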

answered 9 months ago
