Creation of Reliable Relevance Judgments in Information Retrieval Systems Evaluation Experimentation through Crowdsourcing: A Review

Co-Authors

Samimi, Parnia
Ravana, Sri Devi

Source

The Scientific World Journal

Issue

Volume 2014, Issue 2014 (31 December 2014), pp. 1-13, 13 pages.

Publisher

Hindawi Publishing Corporation

Publication Date

2014-05-19

Country of Publication

Egypt

Number of Pages

13

Main Subjects

Human Medicine
Information Technology and Computer Science

Abstract (EN)

Test collections are used to evaluate information retrieval systems in laboratory-based evaluation experiments.

In the classic setting, generating relevance judgments involves human assessors and is a costly and time-consuming task.

Researchers and practitioners are still challenged to perform reliable, low-cost evaluations of retrieval systems.

Crowdsourcing, as a novel method of data acquisition, is broadly used in many research fields.

Crowdsourcing has proven to be an inexpensive and quick solution, as well as a reliable alternative, for creating relevance judgments.

One application of crowdsourcing in IR is judging the relevance of query-document pairs.

For a crowdsourcing experiment to succeed, the relevance judgment tasks must be designed carefully, with an emphasis on quality control.

This paper explores the factors that influence the accuracy of relevance judgments made by crowd workers and how to strengthen the reliability of judgments in crowdsourcing experiments.
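One common quality-control step in such experiments is aggregating redundant worker labels before they enter the test collection. Below is a minimal Python sketch of majority-vote aggregation over hypothetical query-document pairs; it is an illustration of the general technique, not a method taken from the paper, and the pair identifiers and labels are invented.

```python
from collections import Counter

def majority_vote(labels):
    """Return the majority label, or None on a tie (flag for re-judging)."""
    counts = Counter(labels).most_common()
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return None  # no consensus among workers
    return counts[0][0]

# Hypothetical redundant judgments: each (query_id, doc_id) pair is
# labelled by several workers as relevant (1) or not relevant (0).
judgments = {
    ("q1", "d10"): [1, 1, 0],  # majority: relevant
    ("q1", "d11"): [0, 0, 0],  # unanimous: not relevant
    ("q2", "d10"): [1, 0],     # tie: collect another judgment
}

qrels = {pair: majority_vote(labels) for pair, labels in judgments.items()}
print(qrels)  # {('q1', 'd10'): 1, ('q1', 'd11'): 0, ('q2', 'd10'): None}
```

In practice, majority voting is typically combined with further safeguards such as gold-standard trap questions and worker-reliability weighting, which are among the quality-control factors reviews of this kind survey.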

American Psychological Association (APA) Citation Style

Samimi, Parnia & Ravana, Sri Devi. 2014. Creation of Reliable Relevance Judgments in Information Retrieval Systems Evaluation Experimentation through Crowdsourcing: A Review. The Scientific World Journal, Vol. 2014, no. 2014, pp. 1-13.
https://search.emarefa.net/detail/BIM-1048415

Modern Language Association (MLA) Citation Style

Samimi, Parnia & Ravana, Sri Devi. Creation of Reliable Relevance Judgments in Information Retrieval Systems Evaluation Experimentation through Crowdsourcing: A Review. The Scientific World Journal, no. 2014 (2014), pp. 1-13.
https://search.emarefa.net/detail/BIM-1048415

American Medical Association (AMA) Citation Style

Samimi, Parnia & Ravana, Sri Devi. Creation of Reliable Relevance Judgments in Information Retrieval Systems Evaluation Experimentation through Crowdsourcing: A Review. The Scientific World Journal. 2014. Vol. 2014, no. 2014, pp. 1-13.
https://search.emarefa.net/detail/BIM-1048415

Data Type

Articles

Text Language

English

Notes

Includes bibliographical references

Record Number

BIM-1048415