This is a machine learning problem and goes beyond AWS Ground Truth.
Usually you cannot measure how confident each annotation is unless you explicitly ask the annotators to report a confidence score alongside each annotation.
Usually you have to provide some gold-standard annotations that you believe are correct. You create these by annotating a sample of PDFs yourself, then check each annotator's performance against that set. You can also compute metrics such as Cohen's kappa to assess agreement between annotators: https://en.wikipedia.org/wiki/Cohen%27s_kappa
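To make the kappa suggestion concrete, here is a minimal pure-Python sketch of Cohen's kappa for two annotators labelling the same items; the entity labels are made-up example data, not from any real job:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two annotators' labels on the same items."""
    assert len(a) == len(b) and a, "need two equal-length, non-empty label lists"
    n = len(a)
    # observed agreement: fraction of items both annotators labelled the same
    po = sum(x == y for x, y in zip(a, b)) / n
    # chance agreement: from each annotator's marginal label frequencies
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[label] * cb[label] for label in set(ca) | set(cb)) / (n * n)
    return (po - pe) / (1 - pe)

# hypothetical entity labels from two annotators on six spans
ann1 = ["ORG", "PER", "ORG", "LOC", "PER", "ORG"]
ann2 = ["ORG", "PER", "LOC", "LOC", "PER", "ORG"]
print(cohens_kappa(ann1, ann2))  # → 0.75
```

In practice you would run this per entity type as well as overall, since agreement often varies a lot between easy and hard entity classes.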
Sorry for not being clear in the question. I actually want the work completion status. Is there a way to get a summary of daily task completion details? For example: the number of documents labelled each day, and the count of documents annotated by each person per day.
This metric is very important for tracking progress and following up with annotators.
(I made an edit to the question.) Thanks.
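One approach worth trying: Ground Truth writes a per-task worker-response JSON file to the job's output location in S3, and those files include the worker ID and a submission timestamp. The field names below (`answers`, `workerId`, `submissionTime`) are what I have seen in that output, but verify them against your own job's files before relying on this sketch. Given the parsed files, the daily summary is a simple aggregation:

```python
from collections import defaultdict

def daily_counts(responses):
    """Count labelled documents per day and per (day, worker).

    `responses` is a list of worker-response dicts, one per task, in
    the shape Ground Truth writes under the job's worker-response
    output prefix. Field names are assumptions to verify on real data.
    """
    per_day = defaultdict(int)
    per_day_worker = defaultdict(int)
    for resp in responses:
        for ans in resp.get("answers", []):
            day = ans["submissionTime"][:10]  # ISO-8601 timestamp -> YYYY-MM-DD
            per_day[day] += 1
            per_day_worker[(day, ans["workerId"])] += 1
    return dict(per_day), dict(per_day_worker)

# hypothetical sample data mimicking three worker-response files
sample = [
    {"answers": [{"workerId": "w1", "submissionTime": "2023-05-01T10:12:00Z"}]},
    {"answers": [{"workerId": "w2", "submissionTime": "2023-05-01T11:30:00Z"}]},
    {"answers": [{"workerId": "w1", "submissionTime": "2023-05-02T09:05:00Z"}]},
]
by_day, by_day_worker = daily_counts(sample)
print(by_day)                              # → {'2023-05-01': 2, '2023-05-02': 1}
print(by_day_worker[("2023-05-01", "w1")]) # → 1
```

For overall job progress (total vs. labelled objects), the SageMaker `DescribeLabelingJob` API also returns a `LabelCounters` structure you could poll daily, though it is not broken down per worker.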
Can you tell us more about this task? What are you labelling in the PDFs? Does each PDF get a single label, or are you labelling individual sentences within the PDFs?
I am labelling the entities (our use-case-specific entities) present in the PDFs.
Looking at https://github.com/aws-samples/aws-sagemaker-ground-truth-recipe/blob/master/aws_sagemaker_ground_truth_sample_lambda/annotation_consolidation_lambda.py, it seems you can only get the worker ID at the moment. I don't know whether you could add a timestamp to each annotation by modifying a custom template (https://docs.amazonaws.cn/en_us/sagemaker/latest/dg/sms-custom-templates-step3.html), but that might be complicated.
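An alternative to modifying the template: since the post-annotation (consolidation) Lambda already sees each worker's answer, it could stamp the time it processed each one. This is a hedged sketch of just the consolidation step, not a full Lambda handler; the input shape (`workerId` plus `annotationData.content`) mirrors the per-object annotations in the sample Lambda linked above, but treat the field names as assumptions to check against that repo:

```python
import json
from datetime import datetime, timezone

def consolidate_with_timestamp(annotations):
    """Keep each worker's parsed answer and record when the
    consolidation Lambda processed it. Note this is the processing
    time, not the worker's actual submission time.
    """
    consolidated = []
    for ann in annotations:
        consolidated.append({
            "workerId": ann["workerId"],
            "annotation": json.loads(ann["annotationData"]["content"]),
            "processedAt": datetime.now(timezone.utc).isoformat(),
        })
    return consolidated

# hypothetical single-worker answer for one document
example = [{
    "workerId": "private.us-east-1.abc123",
    "annotationData": {"content": json.dumps({"entity": "ACME", "type": "ORG"})},
}]
out = consolidate_with_timestamp(example)
print(out[0]["workerId"], out[0]["annotation"]["type"])
```

Writing the consolidated output (with these timestamps) to your own S3 prefix or a database from the Lambda would then give you the raw data for the daily per-worker counts asked about above.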