Search Results

  • Article
    Bafti SM, Ang CS, Hossain MM, Marcelli G, Alemany-Fornes M, Tsaousis AD.
    Comput Biol Med. 2021 Mar;130:104204.
    State-of-the-art computer-vision algorithms rely on large, accurately annotated datasets, which are expensive, laborious and time-consuming to generate. This task is even more challenging for microbiological images, because they require specialized expertise for accurate annotation. Previous studies show that crowdsourcing and assistive-annotation tools are two potential solutions to this challenge. In this work, we have developed a web-based platform for crowdsourced annotation of image data; the platform is powered by a semi-automated assistive tool that supports non-expert annotators and improves annotation efficiency. The behavior of annotators with and without the assistive tool is analyzed using biological images of varying complexity. More specifically, non-experts were asked to use the platform to annotate microbiological images of gut parasites, and their annotations were compared with those of experts. A quantitative evaluation of the results confirms that the assistive tool can noticeably reduce the cost of non-expert annotation (time, clicks, interactions, etc.) while preserving or even improving annotation quality. The annotation quality of non-experts was assessed using IoU (intersection over union), precision, and recall; based on this analysis we propose some ideas on how to better design similar crowdsourcing and assistive platforms.
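The abstract above evaluates non-expert annotation quality with IoU (intersection over union), precision, and recall. As a point of reference, here is a minimal sketch of how these pixel-level metrics can be computed between an annotator's binary mask and an expert reference mask; the function name and the NumPy-based implementation are illustrative assumptions, not taken from the article.

    import numpy as np

    def annotation_scores(pred, truth):
        """Pixel-level IoU, precision, and recall between two binary masks.

        pred  -- non-expert annotation mask (array of 0/1 or bool)
        truth -- expert reference mask, same shape
        (Hypothetical helper; not from the cited article.)
        """
        pred = np.asarray(pred, dtype=bool)
        truth = np.asarray(truth, dtype=bool)
        tp = np.logical_and(pred, truth).sum()    # pixels both marked
        fp = np.logical_and(pred, ~truth).sum()   # pixels only the annotator marked
        fn = np.logical_and(~pred, truth).sum()   # pixels only the expert marked
        union = tp + fp + fn
        iou = tp / union if union else 1.0        # two empty masks count as a match
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        return iou, precision, recall

    # Hypothetical usage: 1 = annotated pixel, 0 = background
    pred  = np.array([[1, 1, 0], [0, 1, 0]])
    truth = np.array([[1, 0, 0], [0, 1, 1]])
    iou, precision, recall = annotation_scores(pred, truth)
    # tp=2, fp=1, fn=1 -> iou=0.5, precision=2/3, recall=2/3

Treating each pixel as a binary classification means precision penalizes over-annotation, recall penalizes missed regions, and IoU combines both into a single overlap score.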