Take questions from the crowd


Citizen science can help to set research agendas, say Marion Poetz and Henry Sauermann

Crowd and citizen science projects, which involve the general public or specific subgroups such as patients as active participants in research, are becoming an established part of the academic landscape.

Involving citizens in research boosts productivity by allowing scientists to collect and analyse much larger data sets or to tap into the wisdom of the crowd. Such projects have resulted in publications in top-tier journals in a range of fields, from medicine to quantum physics.

Most crowd science projects involve citizens only after scientists have set the research questions and aims: they seek help with tasks such as image classification or problem solving. In other sectors, though, such as innovation and product development, studies have shown that the crowd can also perform well at identifying problems.

We wanted to know whether the same might hold for research questions. The medical sciences are an obvious candidate, as there are large communities of engaged healthcare professionals and patients who understand both the problems and the existing solutions.

We analysed data from two crowdsourcing projects in mental health and traumatology (the study of wounds and injuries) sponsored by the Ludwig Boltzmann Society in Austria. These sought research questions from patients and their relatives, as well as medical practitioners such as nurses and physicians.

In the mental health project, participants could submit any number of questions, along with an explanation of why they had chosen them. After excluding a small number of incomplete or off-topic submissions, our study included 753 questions from 155 citizens. Over two-thirds of these participants contributed more than one question, with the most prolific offering 86. In the traumatology project, each participant could submit only one suggestion; we analysed 151 questions.

Scoring the questions

To gauge the quality of citizens’ ideas, we compared them with typical professional research questions taken from academic conference proceedings in the same disciplines. We then asked independent professional scientists to score all questions, without knowing who had formulated them, on the dimensions of novelty, scientific impact and practical impact.

Citizens’ questions showed two interesting patterns. First, many restated general problems, such as “How can we speed up wound healing?” These show what citizens find important, but do not offer concrete research ideas.

Second, crowdsourced questions tended to be more interdisciplinary. As well as combining different areas of medicine, many also linked with broader socioeconomic contexts, such as “To what extent do accidental injuries impact the victim’s social life?”

On average, evaluators scored crowdsourced questions as less novel and less scientifically relevant than those from academics, but just as high or even higher on potential practical impact. Overall averages do not show the whole picture, however. In the mental health project, the best 20 per cent of crowdsourced questions outperformed the professionals' questions on all three dimensions. In the traumatology project, the best 20 per cent were on a par with the professional questions in novelty and scientific impact, and outperformed them strongly on practical impact.

The Ludwig Boltzmann Society has already funded three research groups to investigate crowd-generated questions: two in mental health and one in traumatology.

Our results suggest that involving citizens in generating research questions can help steer projects in new directions and increase their relevance to real-life problems. Many academics study questions that affect citizens, in fields such as medicine, the environmental sciences and economic development. But academics can go round in circles, revisiting the same issues and trying similar solutions. Researchers should consider whether involving crowds in the conceptual stages of the research process could help them to be more effective.

Of course, the effort required to recruit participants and run crowdsourcing projects should not be underestimated. The more contributions a project generates, the smarter scientists have to be in setting up screening and evaluation mechanisms that identify the most promising ideas. And it is not clear how well crowdsourcing research questions works in fields further removed from citizens’ everyday experiences.

Even so, past success in engaging the crowd at later stages of the research process, even in blue-sky disciplines, suggests that the public may have interesting and important ideas here as well. We are working on follow-up studies and a book to help scientists decide whether, when and how to involve citizens in research.

Marion Poetz is an associate professor at Copenhagen Business School. Henry Sauermann is professor of strategy at the European School of Management and Technology in Berlin.

This article also appeared in Research Europe