Validating annotations in AWS Ground Truth


Is there a way in AWS Ground Truth to validate annotations from workers during the labeling task and remove a worker if their performance is really poor before they finish the task?

  • I am also looking for a similar feature. Currently we have a person labelling PDF documents. We want to review their work; is there a way to randomly check a PDF against its annotated fields?

  • @Navin There are definitely ways to validate this after the annotators complete the task.

    My question here is about checking annotators during the annotation process. For some annotations, I would know the actual true answer, the gold standard. If an annotator makes too many mistakes on the gold standard, they get kicked out.
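
    For example, something along these lines is what I have in mind. This is only a sketch: the submission format, the gold answers, and the thresholds are assumptions, and it relies on the fact that a private work team is backed by an Amazon Cognito user group (the mapping from Ground Truth worker IDs to Cognito usernames depends on how the workforce is configured).

    ```python
    import boto3

    # Rough sketch only. GOLD_ANSWERS, the submission format, and the thresholds
    # are assumptions; adapt them to your own labeling job output.
    GOLD_ANSWERS = {"doc-0017.pdf": "invoice", "doc-0042.pdf": "receipt"}
    ACCURACY_THRESHOLD = 0.7   # kick a worker out below this accuracy on gold items
    MIN_GOLD_ITEMS = 5         # never judge a worker on fewer gold items than this

    def score_workers(submissions):
        """submissions: iterable of {"worker": ..., "item": ..., "label": ...}
        parsed from the worker-response JSON that Ground Truth writes under the
        labeling job's S3 output prefix. Returns per-worker accuracy on gold items."""
        stats = {}
        for s in submissions:
            gold = GOLD_ANSWERS.get(s["item"])
            if gold is None:
                continue  # not a gold-standard item, nothing to check
            correct, total = stats.get(s["worker"], (0, 0))
            stats[s["worker"]] = (correct + int(s["label"] == gold), total + 1)
        return {w: c / t for w, (c, t) in stats.items() if t >= MIN_GOLD_ITEMS}

    def remove_low_performers(accuracy, user_pool_id, group_name):
        """A private work team maps to a Cognito user group, so removing a member
        from that group stops them from receiving further tasks."""
        cognito = boto3.client("cognito-idp")
        for worker, acc in accuracy.items():
            if acc < ACCURACY_THRESHOLD:
                cognito.admin_remove_user_from_group(
                    UserPoolId=user_pool_id, Username=worker, GroupName=group_name
                )
    ```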

Asked 2 years ago · 339 views
2 Answers

Just wanted to share that when the labels on a dataset need to be validated, Amazon SageMaker Ground Truth provides functionality to have workers verify that labels are correct or to adjust previous labels. These types of jobs fall into two distinct categories:

  1. Label verification — Workers indicate if the existing labels are correct, or rate their quality, and can add comments to explain their reasoning. Workers will not be able to modify or adjust labels. If you create a 3D point cloud or video frame label adjustment or verification job, you can choose to make label category attributes (not supported for 3D point cloud semantic segmentation) and frame attributes editable by workers.

  2. Label adjustment — Workers adjust prior annotations and, if applicable, label category and frame attributes to correct them.

The following Ground Truth built-in task types support adjustment and verification labeling jobs:

  • Bounding box
  • Semantic segmentation
  • 3D point cloud object detection, 3D point cloud object tracking, and 3D point cloud semantic segmentation
  • All video frame object detection and video frame object tracking task types: bounding box, polyline, polygon, and keypoint

Please refer to the link below for more details on the above methods: https://docs.aws.amazon.com/sagemaker/latest/dg/sms-verification-data.html
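
In case it helps, here is a minimal boto3 sketch of launching a bounding-box adjustment job chained from the original job's output manifest. All job names, bucket paths, role and work team ARNs are placeholders, and the correct region-specific pre-annotation and annotation-consolidation Lambda ARNs for adjustment/verification tasks are listed in the documentation linked above.

```python
import boto3

sm = boto3.client("sagemaker")

# Sketch of a bounding-box *adjustment* job; all names and ARNs are placeholders.
sm.create_labeling_job(
    LabelingJobName="bbox-adjustment-job",
    # Must differ from the label attribute name used by the original job.
    LabelAttributeName="adjusted-bounding-box",
    InputConfig={
        "DataSource": {
            "S3DataSource": {
                # Output manifest of the job whose labels are being adjusted.
                "ManifestS3Uri": "s3://my-bucket/original-job/manifests/output/output.manifest"
            }
        }
    },
    OutputConfig={"S3OutputPath": "s3://my-bucket/adjustment-output/"},
    RoleArn="arn:aws:iam::111122223333:role/MyGroundTruthRole",
    LabelCategoryConfigS3Uri="s3://my-bucket/label-categories.json",
    HumanTaskConfig={
        "WorkteamArn": "arn:aws:sagemaker:us-east-1:111122223333:workteam/private-crowd/my-team",
        "UiConfig": {"UiTemplateS3Uri": "s3://my-bucket/templates/adjust-bbox.liquid.html"},
        # Region-specific built-in Lambdas for adjustment tasks; see the docs above.
        "PreHumanTaskLambdaArn": "arn:aws:lambda:us-east-1:432418664414:function:PRE-AdjustmentBoundingBox",
        "TaskTitle": "Adjust bounding boxes",
        "TaskDescription": "Correct the existing boxes where they are wrong",
        "NumberOfHumanWorkersPerDataObject": 1,
        "TaskTimeLimitInSeconds": 600,
        "AnnotationConsolidationConfig": {
            "AnnotationConsolidationLambdaArn": "arn:aws:lambda:us-east-1:432418664414:function:ACS-AdjustmentBoundingBox"
        },
    },
)
```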

AWS
Answered a year ago
  • Thanks a lot for your reply. These are really great tools. Unfortunately, I have unlabelled data and I would like to assess the performance of annotators while they are doing their job.


Thank you for sharing those details about label verification and adjustment functionalities within AWS Ground Truth. While those features are valuable, I'd like to offer an alternative solution that we've implemented successfully for a similar scenario.

At labellerr, we built a randomized verification pipeline that draws already quality-accepted files from a labeled pool, selecting them for similarity and diversity with respect to the current batch. These check items are mixed into the task stream, which enables real-time, consensus-based quality checks without the annotators knowing which items are checks, keeping the annotation standard high.

Additionally, our system uses each annotator's past performance signals to predict where errors are likely, so potential issues can be identified and corrected proactively.

This approach has significantly improved our quality control without slowing down annotation, and we have found it effective at keeping annotations accurate while streamlining validation.
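
To make the idea concrete, here is a generic sketch (not our production code) of the core mechanism: pre-verified check items are shuffled into each worker's queue so they are indistinguishable from new work, and a running score on those items flags anyone who drops below a threshold. The injection rate, the threshold, and the item format are illustrative assumptions.

```python
import random

CHECK_RATE = 0.10        # fraction of check items mixed into each queue (assumption)
SCORE_THRESHOLD = 0.8    # flag workers whose accuracy on check items drops below this

def build_queue(new_items, check_items):
    """Shuffle pre-verified check items in among unlabeled items so that
    annotators cannot tell which items are quality checks."""
    n_checks = max(1, int(len(new_items) * CHECK_RATE))
    queue = list(new_items) + random.sample(check_items, n_checks)
    random.shuffle(queue)
    return queue

class WorkerScore:
    """Running accuracy on check items, used to flag a struggling annotator."""
    def __init__(self):
        self.correct = 0
        self.total = 0

    def record(self, item, answer):
        # Only check items contribute to the score; regular items pass through.
        if item.get("is_check"):
            self.total += 1
            self.correct += int(answer == item["expected"])

    @property
    def flagged(self):
        return self.total >= 5 and self.correct / self.total < SCORE_THRESHOLD
```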

Answered 4 months ago
