Slides of the panel presentation of the Accurator nichesourcing framework at MCN 2014. You can find a trip report of MCN 2014 here.
The Cultural Heritage domain has opened up to contributions from users on the web. These contributions are mainly in the form of tags that describe certain aspects of a cultural heritage object. With such a wide range of users on the web, it becomes important to determine the quality of user-contributed content before it is published online. However, manually evaluating the quality of these user-generated contributions is resource-intensive for Cultural Heritage institutions. In this talk, I will describe methods that can semi-automatically predict the quality of tags. These methods address three research questions: How can we trust an online contributor? How can we assess the quality of the annotation process? And how can we trust the contributed data? The slides for the presentation can be found here.
Large datasets such as Cultural Heritage collections require detailed annotations when digitised and made available online. Annotating different aspects of such collections requires a variety of knowledge and expertise that collection curators do not always possess. Artwork annotation is an example of a knowledge-intensive image annotation task, i.e. a task that demands annotators have domain-specific knowledge in order to complete it successfully. Today, Lora Aroyo will present at the WebSci2014 conference the results of a study investigating the applicability of crowdsourcing techniques to knowledge-intensive image annotation tasks. We observed a clear relationship between the annotation difficulty of an image, in terms of the number of items to identify and annotate, and the performance of the recruited workers. Here you can see the poster and the slides of the presentation.