Leveraging metacognitive ability to improve crowd accuracy via impossible questions.
https://webfiles.uci.edu/msteyver/publications/ImpossibleQuestions.pdf
https://osf.io/8r3t9/?view_only=c5a9694d4900431ab00566e124a10b1d
Aggregating judgments across individuals can be quite accurate, especially when individuals with expert judgment can be identified. A number of procedures have been developed to identify such experts using historical performance or questionnaire data.
Here we measure expertise by a participant's tendency to skip impossible questions. These questions have no correct answer and serve as a metacognitive measure of a participant's ability to recognize when they lack knowledge.
In contexts where individuals choose which questions to answer, those who are selective about when to contribute to the crowd are valuable. We find that an individual's propensity to skip impossible questions is related to their expertise, and we leverage these questions to form highly accurate crowds that outperform other methods of identifying experts that rely on historical accuracy.
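As an illustration of this selection rule, the sketch below is a minimal, hypothetical implementation rather than the authors' code: the data layout, function names, and the crowd size k are assumptions. It ranks participants by their skip rate on impossible questions, keeps the top k as the crowd, and aggregates their answers by majority vote.

from collections import Counter

def skip_rate(answers, impossible_ids):
    """Fraction of impossible questions a participant declined to answer (None = skipped)."""
    skipped = sum(1 for q in impossible_ids if answers.get(q) is None)
    return skipped / len(impossible_ids)

def select_crowd(responses, impossible_ids, k):
    """Rank participants by skip rate on impossible questions; keep the top k."""
    ranked = sorted(responses, key=lambda p: skip_rate(responses[p], impossible_ids), reverse=True)
    return ranked[:k]

def crowd_answer(responses, crowd, question):
    """Majority vote among selected crowd members who answered the question."""
    votes = [responses[p][question] for p in crowd if responses[p].get(question) is not None]
    return Counter(votes).most_common(1)[0][0] if votes else None

# Illustrative data: two regular questions plus one impossible question ("imp1").
responses = {
    "p1": {"q1": "A", "q2": "B", "imp1": None},   # skips the impossible item
    "p2": {"q1": "A", "q2": "C", "imp1": "B"},    # guesses on the impossible item
    "p3": {"q1": "D", "q2": "B", "imp1": None},
}
crowd = select_crowd(responses, impossible_ids=["imp1"], k=2)
print(crowd, crowd_answer(responses, crowd, "q1"))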