What is it that has driven thousands of members of the public, with no discernible interest in pursuing a career in research, to spend their Saturday afternoons reading hundreds of ships’ logbooks and entering their findings into an online database?
Likewise, why are more than 800,000 people registered members of Zooniverse, an online portal that enables them to help researchers by, among other things, listening to and describing bat calls, sifting through 30 years’ worth of tropical cyclone data, or transcribing ancient Greek manuscripts?
The concept is simple. Certain datasets cannot be processed by computers - the information is too difficult for a machine to read. Some datasets are simply massive, such as the ships’ logbooks, which number in the hundreds of thousands in the UK alone. The solution is to make that data available online and ask members of the public - or citizen scientists - to crunch it.
Such online “crowdsourcing” to facilitate research has been around for a while. The Galaxy Zoo project, for example, has been drawing on the wisdom of the crowd to map outer space since 2007, giving thousands of amateur astronomers access to images from the Hubble Space Telescope.
Now researchers are hoping to discover what it is that makes people interact with crowdsourcing research projects, and how to keep them coming back for more.
A team headed by Joe Cox, an economist at the University of Portsmouth, has been awarded a £750,000 grant by the Engineering and Physical Sciences Research Council to establish why people give up their time to help scientists so willingly, and to investigate how strategic interventions can help to optimise and sustain people’s interactions.
“There is existing literature on the economics of volunteering, but it has got to the point where it is showing its age,” Dr Cox said. “If we don’t understand the nature of the resources we have - the volunteers - and how to manage them effectively, it is more difficult to maximise the value they generate.”
Kevin Wood is a climate scientist at the University of Washington’s Joint Institute for the Study of the Atmosphere and Ocean. He works on the US arm of Old Weather, the Zooniverse project that is gathering billions of weather observations recorded in ships’ logbooks in order to better understand climate change.
He agrees that a better understanding of what motivates citizen scientists would be useful because participants do not always behave as expected.
“Our volunteers are only asked to input the weather observations,” Dr Wood said. “However, some people are transcribing crew lists, records of the things coming on and off the boat, people lost at sea, sightings of the aurora - even the amount of coal and oil consumed.”
One of the concerns that Dr Cox’s research could address is a fear of saturation. As more and more research comes online, demanding more and more of people’s time, is it likely that only the most effective and engaging projects will survive?
“I would be concerned about saturation and too many of these things starting to overwhelm people,” Dr Cox said. “But the evidence shows that when Zooniverse launches a project, overall participation rates go up - not just for the new project but also for existing projects.”
That worry was echoed by Chris Lintott, Zooniverse director, University of Oxford researcher and citizen science project lead, and a partner on the EPSRC-funded research. But he added that there was a more pressing need for research into what keeps citizen scientists coming back for more.
“What I want to be able to do is detect if [someone is] particularly good at seeing a particular thing - for example, faint fuzzy galaxies - so we can tailor the experience,” Dr Lintott said. “However, the solutions that exist don’t take into account the fact that we have real people doing the work, and real people get bored. You may be the best in the world at seeing faint fuzzy galaxies, but if I just gave you an unremitting diet of faint fuzzy galaxies, you’d probably leave.”
Make a game of it
One possibility being explored by Cancer Research UK is the “gamification” of crowdsourcing - surreptitiously getting people to help with research while they are doing something enjoyable.
The charity has already had success with Cell Slider, a crowdsourcing project that asks people to view images of breast cancer samples and identify irregularities. It is not “gamified”; it simply presents users with a steady string of images to analyse.
Despite its success (in its first three months, citizen scientists completed a dataset that would have taken researchers 18 months), the charity’s next project will take a different direction, aiming to produce a smartphone game in which players will have to identify changes in genes that make up breast cancer tumours.
Amy Carton, citizen science lead at the charity, believes this approach could help to reach people who are not sufficiently excited by the idea of helping with research to take part in existing crowdsourcing projects.
“We already have a ‘sciencey’ audience that we reach through Cell Slider, but there is this whole other audience - little brothers, best mates, parents, grandparents - who all have some time spare,” she said. “Instead of playing solitaire or Angry Birds, why not have them playing our game?”
The aim is to create something so appealing that participants do not have to know that they are helping cancer researchers. “We want them to play this game for the game’s sake - so that they’ll keep coming back,” Ms Carton added.
The charity hosted a three-day GameJam in March. This brought together gamers, programmers, graphic designers and other specialists from organisations including Facebook, Amazon and Google in London to develop ideas. Twelve potential games emerged, and it is hoped that one of them will be available by the autumn.
Not everyone agrees with Cancer Research UK’s approach. Having explored a similar approach on some early Zooniverse projects, Dr Lintott has reservations about making crowdsourced research more akin to gaming.
“We’ve done some trials where we experimented with giving feedback in the form of points, and we found that while some of the people who weren’t very good did look to improve, others simply left,” he said.
“However, the best people were systematically leaving, because once you switch into collecting points, and you find you’re winning the game, then you get the impression that you’ve finished - mastered it - and you put the game down. We’d essentially built a system that drove away our best people.”
Although Dr Lintott acknowledged the potential value in gamification, particularly in reaching an audience that is not currently engaging with crowdsourced research, he is wary of the difficulties it presents.
“With gamification you need to be in a space where you know what the right answer is, so that you can give people points if they find it. But in a lot of the projects we run, you can’t do that because we don’t know the right answer until we’ve had tens of people look at it.
“Also, there’s a danger of clumsy gamification. If I’m right, and people are doing this because they want an authentic experience of doing research, then making it a bit game-like could destroy that.”