Ground Truth job validation after completion


We annotated 10 PDF files in Ground Truth. How do I validate the annotations done by the team? Are there any metrics available, for example: how many annotations were done in each PDF, and what is the confidence score for each annotation?

My idea is that if I get these metrics, I will review the documents with fewer annotations and the documents with low confidence scores.

Can Ground Truth experts provide some insight into this?

  • Can you tell us more about this task? What are you labelling in the PDFs? Does each PDF get a single label, or are you labelling sentences within the PDFs?

1 Answer

This is a general machine learning problem and goes beyond SageMaker Ground Truth itself.

Usually you cannot measure how confident each annotation is unless you asked the annotators to record their own confidence for each annotation.
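For the first part of your question (how many annotations ended up in each PDF), you can get that from the labeling job's output manifest. Below is a minimal sketch, assuming a local copy of the JSON Lines manifest and a label attribute named "my-labeling-job"; the exact field names depend on your task type, so treat them as placeholders, not as the actual schema of your job.

```python
import json

MANIFEST_PATH = "output.manifest"       # hypothetical local copy of the output manifest
LABEL_ATTRIBUTE = "my-labeling-job"     # hypothetical label attribute name; replace with yours

counts = {}
with open(MANIFEST_PATH, encoding="utf-8") as manifest:
    for line in manifest:
        record = json.loads(line)
        doc = record.get("source-ref", "unknown")   # S3 URI of the input document
        labels = record.get(LABEL_ATTRIBUTE, {})
        # Many task types store the individual annotations in a list under the
        # label attribute; adjust the key below to match your job's output.
        annotations = labels.get("annotations", []) if isinstance(labels, dict) else []
        counts[doc] = counts.get(doc, 0) + len(annotations)

# Review the documents with the fewest annotations first.
for doc, n in sorted(counts.items(), key=lambda kv: kv[1]):
    print(f"{n:4d}  {doc}")
```

You would download the manifest from the job's S3 output location first; the script only sorts documents by annotation count so you know where to start your manual review.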

Usually you have to provide some gold-standard annotations that you believe are correct. You create these yourself on a sample of the PDFs, then check each annotator's performance against that set. You can also compute metrics such as Cohen's kappa to assess agreement between annotators: https://en.wikipedia.org/wiki/Cohen%27s_kappa
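As an illustration of the agreement check, here is a short sketch using scikit-learn's cohen_kappa_score on two annotators' labels for the same items; the label values below are made up for the example.

```python
from sklearn.metrics import cohen_kappa_score

# Labels assigned by two annotators to the same items
# (e.g. sentences or regions from your gold-standard PDFs).
annotator_a = ["invoice", "receipt", "invoice", "contract", "invoice"]
annotator_b = ["invoice", "receipt", "contract", "contract", "invoice"]

kappa = cohen_kappa_score(annotator_a, annotator_b)
print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 = perfect agreement, ~0 = chance-level agreement
```

A low kappa on the gold-standard sample is a signal to review that annotator's work (or to clarify the labelling instructions) before trusting the rest of their annotations.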

answered 13 days ago
