2021
DOI: 10.1038/s41598-021-83827-4
EXACT: a collaboration toolset for algorithm-aided annotation of images with annotation version control

Abstract: In many research areas, scientific progress is accelerated by multidisciplinary access to image data and their interdisciplinary annotation. However, keeping track of these annotations to ensure a high-quality multi-purpose data set is a challenging and labour intensive task. We developed the open-source online platform EXACT (EXpert Algorithm Collaboration Tool) that enables the collaborative interdisciplinary analysis of images from different domains online and offline. EXACT supports multi-gigapixel medical…

Cited by 35 publications (18 citation statements)
References 21 publications
“…Annotation data collection was acquired in a two-step process performed by three trained biologists. In the first step, they labeled each image in a competitive process (using the EXACT labeling tool [31]), where each expert created bounding box annotations for each animal, including a label for their identities, not knowing about the annotations made by the other experts. The global accordance metric after the first labeling round was R = 0.938.…”
Section: Labeling Process
confidence: 99%
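The excerpt above reports a "global accordance metric" of R = 0.938 for independently created bounding-box annotations, but the quoted passage does not define how that metric is computed. As a purely illustrative stand-in (not the cited authors' formula), inter-annotator agreement on bounding boxes is often estimated by greedily matching boxes between two experts at an IoU threshold and reporting the matched fraction:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def agreement(boxes_a, boxes_b, thr=0.5):
    """Fraction of annotations matched between two experts.

    Greedy matching: each box from expert A claims its best-overlapping
    unmatched box from expert B if their IoU reaches the threshold.
    Hypothetical example metric only -- the accordance metric R in the
    cited work may be defined differently.
    """
    unmatched = list(boxes_b)
    matches = 0
    for box in boxes_a:
        best = max(unmatched, key=lambda b: iou(box, b), default=None)
        if best is not None and iou(box, best) >= thr:
            matches += 1
            unmatched.remove(best)
    total = len(boxes_a) + len(boxes_b)
    return 2 * matches / total if total else 1.0
```

A value near 1.0 would indicate that the experts drew nearly identical sets of boxes; disagreement on identity labels would require an additional per-match comparison.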
“…The four test patches for evaluation were manually annotated into HER2 scored tumor areas, non-tumor areas and staining artefacts. These areas were segmented using polygons in the EXACT tool [10]. Annotations were performed by a medical student and reviewed by a board-certified pathologist.…”
Section: Dataset
confidence: 99%
“…Overall, annotations were gathered for seven canine cutaneous tumor subtypes as well as six additional tissue classes: epidermis, dermis, subcutis, bone, cartilage and a joint class of inflammation and necrosis. For database review, the open source online platform EXACT 27 was used to monitor slide and annotation completeness.…”
Section: Annotation Workflow
confidence: 99%