Please use this identifier to cite or link to this item: https://doi.org/10.1007/s10579-012-9176-1
DC Field: Value

dc.title: Perspectives on crowdsourcing annotations for natural language processing
dc.contributor.author: Wang, A.
dc.contributor.author: Hoang, C.D.V.
dc.contributor.author: Kan, M.-Y.
dc.date.accessioned: 2013-07-04T07:42:54Z
dc.date.available: 2013-07-04T07:42:54Z
dc.date.issued: 2013
dc.identifier.citation: Wang, A., Hoang, C.D.V., Kan, M.-Y. (2013). Perspectives on crowdsourcing annotations for natural language processing. Language Resources and Evaluation 47 (1): 9-31. ScholarBank@NUS Repository. https://doi.org/10.1007/s10579-012-9176-1
dc.identifier.issn: 1574-020X
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/39496
dc.description.abstract: Crowdsourcing has emerged as a new method for obtaining annotations for training models for machine learning. While many variants of this process exist, they largely differ in their methods of motivating subjects to contribute and the scale of their applications. To date, there has yet to be a study that helps the practitioner to decide what form an annotation application should take to best reach its objectives within the constraints of a project. To fill this gap, we provide a faceted analysis of crowdsourcing from a practitioner's perspective, and show how our facets apply to existing published crowdsourced annotation applications. We then summarize how the major crowdsourcing genres fill different parts of this multi-dimensional space, which leads to our recommendations on the potential opportunities crowdsourcing offers to future annotation efforts. © 2012 Springer Science+Business Media B.V.
dc.description.uri: http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1007/s10579-012-9176-1
dc.source: Scopus
dc.subject: Annotation
dc.subject: Crowdsourcing
dc.subject: Games with a purpose
dc.subject: Human computation
dc.subject: Mechanical Turk
dc.subject: NLP
dc.subject: Wikipedia
dc.type: Article
dc.contributor.department: COMPUTER SCIENCE
dc.description.doi: 10.1007/s10579-012-9176-1
dc.description.sourcetitle: Language Resources and Evaluation
dc.description.volume: 47
dc.description.issue: 1
dc.description.page: 9-31
dc.identifier.isiut: 000316004500002
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.
