
Thursday, February 16, 2012

Motivation, Motivation, Motivation

Estate agents will tell you that the most important thing in buying a property is location, but what about citizen science projects? Is it motivation? What is important? And how do you know if you're doing it right? The second session at this week's London Citizen Cyberscience Summit introduced projects looking at motivation, evaluation and engagement to answer some of these questions.

Andrea Wiggins from Syracuse University
Up first was Andrea Wiggins (@AndreaWiggins) from Syracuse University in New York (soon to be Dr Wiggins) with her talk Motivation by Design: Technologies, Experiences, and Incentives. She has looked at many projects, like eBird and The Great Sunflower Project, to investigate what "makes" or "breaks" a project. Her work looked at what rewards were being offered, the activities the volunteers were undertaking and the evolution of the projects (both through demands from the volunteers and from the science).

Her take home points were:
  • If the personal and scientific interests of the volunteers are satisfied by the project, it is easier to attract people.
  • It is important to promote satisfaction and commitment among the participants.
  • Data entry is dull, but a usable interface can combat this.
  • Visualisation of the part played by the volunteer is, anecdotally, a motivator.
  • Introducing an element of gaming to the activity works very well, especially on smartphones, where people can engage during short periods of downtime (e.g. in a queue).
  • Think about how to reward and expand the volunteers' experience at the start of a project.
  • You will need people to manage the project, so pay for them.
  • Think about your audience and align the project with their interests.
  • Good design of the tech/usability is not optional. Doing it upfront is cheaper in the long run.
Tina Philips of Cornell University
The second speaker was also based in New York but delivered her talk slightly differently. Tina Philips from Cornell's Lab of Ornithology was unable to attend in person, so she joined us remotely: as we followed her slides on the big screen, her disembodied voice led us through Motivation, engagement, and learning outcomes. She discussed the impact of the explosion of citizen science projects: over 1,000 papers have been generated for science, but what has it done for the public?

After looking at the existing projects it was obvious that in-depth evaluation was almost non-existent. However, there was strong evidence that high engagement led to deeper learning. This led to the DEVISE (Developing, Validating, and Implementing Standardized Evaluation Instruments) project, which aims to improve evaluation quality and capacity to help inform citizen science and how it can be used for informal science education. They have now built a framework for evaluating individual outcomes that any project can adapt to its needs, and it is being field tested this year.

Ofer Arazy of the University of Alberta
The final speaker of the session was Ofer Arazy of the University of Alberta. He has been looking at how citizen science compares to peer-production projects (like Wikipedia or open source software), which have been extensively studied. His team's focus was on Stardust@home but also the Citizen Weather Observer Program. They studied the motivational drivers and their effect on the quantity and quality of contributions made by volunteers. In the open source software arena, the main driver is the creation of a product the contributors themselves use. Citizen science is different: motivations range from the "selfish" (gaining reputation) to identification (I am part of the community). The team tried to pick apart how these affect a project. After surveying the volunteers to find their motivations, they correlated these with the quantity and quality of the data they provided. There were many complex relationships, but what was important was that quality and quantity were positively affected if volunteers felt part of the community and were contributing to something seen as worthwhile by their peers. However, they also saw that if the rewards on offer were based on quantity then quality suffered, and vice versa.

What I really learned from this session is that knowing what you want as a scientist is one thing, but for citizen science you also need to know what the "citizens" want, and that is not easy.
