Journal article in Journal of Data Mining and Digital Humanities, 2019

Individual vs. Collaborative Methods of Crowdsourced Transcription

Abstract

While online crowdsourced text transcription projects have proliferated in the last decade, there is a need within the broader field to understand differences in project outcomes as they relate to task design, as well as to experiment with different models of online crowdsourced transcription that have not yet been explored. The experiment discussed in this paper involves the evaluation of newly built tools on the Zooniverse.org crowdsourcing platform, attempting to answer the research questions: "Does the current Zooniverse methodology of multiple independent transcribers and aggregation of results render higher-quality outcomes than allowing volunteers to see previous transcriptions and/or markings by other users? How does each methodology impact the quality and depth of analysis and participation?" To answer these questions, the Zooniverse team ran an A/B experiment on the project Anti-Slavery Manuscripts at the Boston Public Library. This paper shares the results of this study and describes the process of designing the experiment and the metrics used to evaluate each transcription method. These include the comparison of aggregate transcription results with ground truth data; evaluation of annotation methods; the time it took for volunteers to complete transcribing each dataset; and the level of engagement with other project elements such as posting on the message board or reading supporting documentation. Particular focus will be given to the (at times) competing goals of data quality, efficiency, volunteer engagement, and user retention, all of which are of high importance for projects that focus on data from galleries, libraries, archives and museums. Ultimately, this paper aims to provide a model for impactful, intentional design and study of online crowdsourcing transcription methods, as well as shed light on the associations between project design, methodology and outcomes.
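As an illustration of the ground-truth comparison mentioned in the abstract, the sketch below computes a character error rate (CER) between an aggregated transcription and a ground-truth line using Levenshtein edit distance. This is a minimal sketch of one common accuracy metric, not the aggregation or evaluation code used in the study; the function names and sample strings are hypothetical.

```python
# Hypothetical sketch: scoring an aggregated transcription against ground
# truth with character error rate (CER). Not drawn from the Zooniverse code.

def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def character_error_rate(transcription: str, ground_truth: str) -> float:
    """Edit distance normalized by ground-truth length (lower is better)."""
    if not ground_truth:
        return 0.0 if not transcription else 1.0
    return levenshtein(transcription, ground_truth) / len(ground_truth)

if __name__ == "__main__":
    # Made-up example strings for illustration only.
    aggregated = "Dear Sir, I recieved your letter"
    truth = "Dear Sir, I received your letter"
    print(f"CER: {character_error_rate(aggregated, truth):.3f}")
```

A per-line CER of this kind could then be averaged over a dataset to compare the independent-and-aggregated condition against the collaborative condition, under the assumption that ground-truth transcriptions exist for the sampled pages.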
Main file: FINAL_Blickhan_IndividualvCollab.pdf (3.86 MB)
Origin: Publication funded by an institution

Dates and versions

hal-02280013 , version 1 (05-09-2019)
hal-02280013 , version 2 (03-12-2019)


Cite

Samantha Blickhan, Coleman Krawczyk, Daniel Hanson, Amy Boyer, Andrea Simenstad, et al.. Individual vs. Collaborative Methods of Crowdsourced Transcription. Journal of Data Mining and Digital Humanities, 2019, Special Issue on Collecting, Preserving, and Disseminating Endangered Cultural Heritage for New Understandings through Multilingual Approaches, ⟨10.46298/jdmdh.5759⟩. ⟨hal-02280013v2⟩
