6. Understanding and connecting to participant motivations
by Mia Ridge, Samantha Blickhan, Meghan Ferriter, Austin Mast, Ben Brumfield, Brendon Wilkins, Daria Cybulska, Denise Burgher, Jim Casey, Kurt Luther, Michael Haley Goldman, Nick White, Pip Willcox, Sara Carlstead Brumfield, Sonya J. Coleman, and Ylva Berglund Prytz
Published on Apr 29, 2021
In this chapter we draw on research and our experiences of inviting and working with participants to highlight the broad range of motivations that bring people into crowdsourcing projects. We believe it is important to be aware of the many reasons people may wish to contribute to your project, as this information is critical to designing user experiences and journeys through your project. What motivates people to take part in crowdsourcing may differ from what drew you to it in the first place, and it varies from person to person.
We outline various dimensions of motivations, touching particularly on a sense of having an impact, creating a sense of competition, and reward and recognition for participation. We also pause to consider paid contributions and the tensions that they bring within heritage crowdsourced projects.
Following this discussion, we share examples of projects and approaches that support a range of motivations, orienting groups of participants toward collective goals, and ways to thoughtfully connect individual and project progress metrics with participant activities.
As a rule of thumb, most motivations for participation in cultural heritage crowdsourcing can be grouped into extrinsic, intrinsic, and altruistic motivations.1 Extrinsic motivations might include tangible rewards, activities (such as games) that are enjoyable enough not to need to draw on intrinsic motivation, a score or grade, or the additional points discussed below. Intrinsic motivations — such as fun, socializing, being part of a community, or an interest in the subject — make participation inherently rewarding. Altruistic motivations can be ideological or collective. Alam and Campbell found that motivations for participation change over time: personal/intrinsic and “community-centric motivations (i.e., altruism and non-profit cause)” are important for recruiting participants to projects, while external motivations (i.e., recognition and rewards) have a greater impact on long-term participation.2
“I like the opportunity to do something that actually contributes to science. I always wanted to pursue science as a career, but that didn’t work out. Now that I’m retired, I love using my new free time to work on actual scientific research.”3
Jane McGonigal summarises much of the literature about motivations for voluntary work in her memorable overview of “what humans crave” and “what museums give us”:4
Before writing this book, we ran a small-scale survey shared on social media, listservs, and crowdsourcing platform blogs, asking project volunteers, “What makes a crowdsourcing/citizen research/online volunteering project great?”5
Responses that elaborated on the four points above included:
“I love the [projects] that I can connect with my daily life. I’ve worked on one classifying animals found on trail cams at a wildlife park near my home. I’ve also been working on one transcribing old survey information. I work regularly with surveyors, so this was fun and it was really neat to see how little has changed.”
“For me, projects that require more than just data entry are more interesting. If I’m going to spend time on a project, I want to be intellectually challenged. This is why I worked on Shakespeare’s World to the bitter end and why I continue to this day with the Folger Shakespeare Library to transcribe 16th and 17th Century handwritten British documents. That project required volunteers to develop a set of skills, resolve unknowns, and help each other to decipher 500-year-old handwriting. With the Ancient Lives and Cairo Geniza projects, I found myself neck-deep in ancient Hebrew and Greek texts, and since I am not a native speaker of either, both the challenge and reward were satisfying.”
“A great project supports and encourages engagement and contact not just between the volunteers and the folk running the project but also actively supports and encourages engagement between its volunteers. The social aspect of a crowdsourcing project is yet another of the intangible rewards offered to volunteers who give their free time and effort to participate in a crowdsourcing project.”
“I love the feeling that I am making a difference and I can see a purpose in the results.”6
Whatever motivations bring someone to your project, it is important to acknowledge and understand them as much as possible while reserving judgment. Participants are not a monolith with the same motivations and needs. Someone who needs documented community service or internship hours is just as important as someone interested in genealogy; we all bring our own set of motivations to a project. It is incumbent on the project team to understand participants’ motivations: if you do not know why someone is interested in your project, it will be more difficult to make sure they have a good experience and feel appreciated.
Spending time getting to know individual participants will shed light on their sometimes unique motivations, but there are some commonalities drawn from our experience and confirmed by academic studies, including previous work on motivation summarized in Ridge’s7 discussion of the role of crowdsourcing in deeper engagement with disciplines and related skills and knowledge. The field of citizen science has produced extensive research and material on learning through participation in crowdsourcing and other participatory projects.8
Perspective: a matrix of motivations — Meghan Ferriter, Library of Congress (formerly Smithsonian Transcription Center)
In 2016, my colleague Christine Rosenfeld and I collaborated with Smithsonian Transcription Center9 volunteers to discuss, reflect on, and write an article about their shared motivations for the frequency and regularity of their contributions.10 I’d been in my role as Project Coordinator for about a year-and-a-half when we talked through these motivations. The thoughtfulness, enthusiasm, and provocation that Dana Boomer, Carla Burgess, Siobhan Leachman, Victoria Leachman, Heidi Moses, Felicia Pickering, and Megan Shuler brought to the conversations and resulting analysis rooted many of my beliefs about the necessity of understanding and supporting the pursuit of motivations when managing these types of projects. More importantly, however, these participant researchers brought into clear focus the practical fuzziness and constantly changing formation of motivations in cultural heritage crowdsourcing. While these perspectives were grounded in the particular context of the Smithsonian Transcription Center, the takeaways resonate with the broader research described in this chapter and elsewhere.
Before I touch on the research results, I should remind readers that the stated goal of the Smithsonian Transcription Center is to engage the public in making collections more accessible; inherently this goal includes encouraging people to connect in meaningful ways with those collections and what they contain. In these 2015 conversations, five themes emerged from discussions around the question “Why do volunteers participate in the [Smithsonian Transcription Center]?” The themes were: For Personal Goals, To Be a Part of Something Bigger, To Belong, To Do It Well, and Based on the Atmosphere and Interface. The participant researchers frequently mentioned more than one motivation in relation to another and described how changing tasks and topics moved them between motivations, even within the same visit to the site. Together these findings highlighted the complexity of engagement and of the motivations for participating. From there it was my job to continue to build upon what worked well and find ways to accommodate and support the motivations I had not deeply considered to that point. No matter their motivations for arriving at and contributing to the Smithsonian Transcription Center, I had the privilege of watching individuals transform their interests and lives through crowdsourcing; indeed, I count myself among those transformed through deep engagement in cultural heritage crowdsourcing.
Linking motivations to design choices and activities
Understanding likely motivations for participation is vital. It helps you write text that connects to those motivations to both recruit and retain participants over time. In our experience, common elements participants need include: communication and updates on project progress, a mission or objective to pursue, support in the form of instructions, reasons to conform to the instructions, guidance when encountering outliers, freedom to connect project elements to their motivation(s), opportunities to grow, and time to learn.11 Each of these is a point at which you could link to common motivations to encourage or deepen participation.
It has been our experience that, as a whole, people are wonderful. People are kind and passionate about helping; people love contributing to projects. When it comes to unearthing our own (and others’) cultural history, people often understand the intrinsic value of our projects and are eager to get involved. The work is often incentive enough. For this reason, be very careful about offering additional incentives — especially external incentives — when they are not necessary. You may find yourself locked into those incentives to keep contributors, even though the work was engaging enough to provide intrinsic motivation without them.
Reporting on impact
As discussed in the “Evaluating your crowdsourcing project” chapter, we advocate for sharing the results of the work — including data and stories about its impact — with your participants, in part because it is immensely rewarding and taps into mutual motivations. In our experience, sharing even tangential research that made use of a project’s results is met with gratitude. Showing participants how their contributions have helped make collections more easily findable and usable by others reinforces altruistic motivations. For example, in an article written collaboratively by lead researchers and volunteers, seven frequently-contributing Smithsonian Transcription Center volunteers identified feedback about the research utility of the data they created as a motivating factor for their participation, connected to a desire to “be a part of something bigger” and to the opportunity to benefit others and humanity.12
“Open reuse status of contributions, Engagement between the provider and the contributor, ability to see how the provider uses your work and how your work makes an impact / makes the world better.”13
Providing broader context can demonstrate the value of your participants’ work in ways that they might not have realized. Data visualizations, even simple ones, can demonstrate potential uses of the crowdsourced data and encourage participants to continue contributing.
Input into future development
Offering feedback mechanisms to participants is a certain type of “reward,” too. Welcoming their input into the development of your project is a way to honor their time and commitment. You can also let your participants influence which material is digitized and made available to work on next, perhaps via a survey. For example, the African-American Civil War Soldiers project, which invited participants to help transcribe enlistment records from the approximately 200,000 soldiers who formed the United States Colored Troops,14 divided and released records to participants state by state. The project team used the message board to set up a poll, asking participants which state records they wanted to transcribe next.
Practicing iterative design and/or creating a user group for testing new features or developments can greatly benefit the institution as well as the participants who have requested these updates. Advisory boards are another way to leverage the expertise and enthusiasm of your most invested volunteers and represent their interests in other aspects of the project. See the “Choosing tasks and workflows” and “Managing cultural heritage crowdsourcing projects” chapters for more details on building levels of engagement into your project.
Though many participants will be happy to continue working as part of the crowd, some will approach the project team with additional suggestions or requests. Creating a plan to continue to engage those “over and above” volunteers will ensure that they feel valued and deepen their involvement. If the crowdsourcing platform you are using allows it, creating an advanced user group to assist with tasks such as review or approval can improve the workflow while acknowledging the expertise and value of participants. This can also help retain participants who have mastered the skills required for the main task and who want a new challenge.15
While participating in a crowdsourcing project can be very educational over time, project staff can create additional instructional materials to aid skill development and education (see the “Supporting participants” chapter for more detail). Crowdsourcing projects are often designed to welcome all skill levels as much as possible, but some talents, such as reading old handwriting, develop with practice. Skill development is therefore a strong potential motivator for participation.
Perspective: recruiting hundreds-at-a-time, rather than one-at-a-time, for WeDigFLPlants — Austin Mast
I serve as Director of Florida State University’s Robert K. Godfrey Herbarium — a library of over 225,000 plant specimens collected over the last 150 years from Florida and elsewhere in the world. Think of it as a natural heritage collection that has many parallels to a cultural heritage collection. Each plant specimen is flattened, dried, and mounted on archival paper with a label that provides the who, what, where, and when of the collecting event. There exist about 3 billion natural heritage specimens of this type (not just plants on sheets, but also fish in jars, fossils in drawers, and insects on pins) worldwide. Our collections community would like to digitally represent this natural heritage in ways that can be aggregated and used for urgent research, conservation, management, policymaking, and other purposes. We view crowdsourcing as one way to advance towards that goal, and we recognize that it offers a valuable opportunity to layer resources on the activity to simultaneously advance science literacy. These layered resources can include interpretive talks, newsletter articles, blog posts, forum dialog, lesson plans, live or virtual tours of the collections, and even games like habitat bingo.
Since 2016, I have been organizing a project focused on producing these resources for participants involved in transcribing label data for a particular set of natural heritage specimens — those plants collected in Florida, irrespective of where in the world they are curated. The project is called WeDigFLPlants.16 I circumscribed it in that way for a very specific reason: I wanted to experiment with recruiting participants not one-at-a-time but hundreds-at-a-time. There were four plant enthusiast groups that I had in mind: Florida Native Plant Society, Florida Master Gardeners, Florida Master Naturalists, and Florida Wildflower Foundation. The goal of WeDigFLPlants is to produce the most complete historical baseline possible for Florida’s plants to put today’s diversity and distribution in context and predict future changes. I recognized that this might resonate with the plant enthusiast groups and organized a workshop in 2017 involving the leadership from each organization to brainstorm ideas that would advance both WeDigFLPlants and their organization’s missions.
Ideas preferred by those leaders were a mix of resources likely to advance science literacy and those that would be fun: onsite transcribing events at collections that were complemented by collection tours, branded gifts that would have value for their social signaling, and a framework for chapters or counties to compete in their participation. We now give WeDigFLPlants stickers as a token of appreciation to those who have completed 50 transcriptions and WeDigFLPlants hats to those who have completed 200. Onsite events occur during the October Worldwide Engagement for Digitizing Biocollections event each year. And we regularly produce scoreboards for our WeDigFLPlants Team Challenge on the BIOSPEX platform (software produced in support of this type of activity by iDigBio, the US National Science Foundation’s National Resource for Advancing Digitization of Biodiversity Collections). As of April 2021, just over 3,200 WeDigFLPlants participants have completed 170,200 label transcriptions.
The most important lessons I have learned in this experiment in recruiting hundreds of participants at a time are: that it works best when preceded by the recruitment of a project champion in the complementary organization’s leadership; that resources bring deeper meaning to the activity not at the moment they are shared but at the moment they are used, and considerable energy can be required to catalyze their usage; and that healthy relationships between a crowdsourcing project and the leadership of a complementary organization are built over years, rather than days or months.
Someone who begins as a casual participant may find that they become increasingly interested and invested in a crowdsourcing project over time. Rather than squander that enthusiasm, we must welcome and direct it in ways that are productive and beneficial for both the participant and the institution. Very dedicated participants may want to consider a career in GLAM (galleries, libraries, archives, or museums) or research related to the arts and humanities. Those already working in the GLAM fields may become interested in running their own project — perhaps even through reading this book! — or advancing the notion of public participation in other ways.
Most project contributions follow the 80/20 rule of behavior, where 80% of the work is done by 20% of the community17 — or an even more skewed 95/5 split (more discussion is available in the “Roles and learning pathways” section of the “Choosing tasks and workflows“ chapter). Projects should be designed to support both casual, even passing, participants and those with more time to spend contributing.
One design decision that can reduce the number of casual or one-off contributions is requiring participants to register for an account in advance. Forcing users to sign up before they can contribute creates a significant barrier to entry.18 This will reduce the number of participants, but that may not be a bad thing: the proportion of more committed participants will increase, and you will have the option to contact them by email if necessary (provided that a valid email address is required at sign-up). User accounts reduce anonymity, however, and raise concerns about data ethics and security. Allowing users to log in optionally is a middle ground. Zooniverse projects, for example, allow you to contribute without signing up, but to mitigate spam posting or harassment, only logged-in participants can post on discussion boards. If your project would benefit from having fewer, but more practiced and experienced, participants, requiring sign-in may be the right decision for you.
Acknowledging and crediting participants
People are often proud of their work — and they deserve to be! — so recognition is vital. Many contributors enjoy seeing a record of their contributions, whether that is a list of pages they have transcribed or simply a number they like seeing rise. Providing users with the ability to create accounts with histories and some form of contribution tracking can be a valuable motivator. FromThePage and Zooniverse, for example, allow transcribers to set a “real name” or “preferred name” attribute that is used to credit transcribers or translators in publications or digital library systems. If there is no “real name” attribute, credit is given to the username.
When there are thousands of participants (whether logged-in or anonymous), it can be impossible to individually recognize each person. Consider your expression of gratitude carefully and choose what works best for your project. In-person events (e.g., transcribeathons) may involve serving snacks or a meal but also allow participants to witness genuine gratitude in person. Additionally, some projects carry acknowledgment through to their resulting data, including it in a creator field or other metadata that remains associated with the source material and/or as a credit line posted publicly on the organization’s website.
“The project managers are wonderful and prompt in responding to questions. They are so appreciative. I feel like I am part of a community.”
“A recognition of the expertise that each person brings to the project is really important, too — on Railway Work, Life & Death we’ve gained so much through listening and working with volunteers and contributors from different research backgrounds (academic, enthusiast, family history, local history, current rail industry… and on). We couldn’t possibly cover the variety we’ve managed to and to make that research public, without the input of the many volunteers.”19
Case study: bringing motivations and activities together at the Library of Virginia
The Library of Virginia hosted a yearly Transcribe-aversary event to mark the anniversary of the launch of their transcription program. Whereas transcribeathons were often smaller, focused work sessions, the big yearly Transcribe-aversary was a celebration of volunteers and an opportunity for them to connect with staff as well as each other. Volunteers were invited to the library for programming related to the crowdsourced project material, fed a boxed lunch and (project) birthday cake, offered tours of the collections and exhibitions, and provided with t-shirts and buttons as symbols of gratitude. Data visualizations and other applications of the crowdsourced results were shown. A Talk Back session encouraged participants to voice their experiences to staff and each other, providing critical feedback about the direction of the project.
Extrinsic motivations: grades, service hours, payments
Some extrinsic motivations play a role in planning and managing a project even though most cultural heritage crowdsourcing projects rely heavily on intrinsic and altruistic motivations. As discussed in the “Supporting participants” chapter, project teams should consider the importance of academic grading or marks for student motivation. Below, we also briefly address the implications of service hours and payments as extrinsic motivations.
Designing for participants needing service hours
Participants who need community service hours may be drawn to crowdsourcing projects for extrinsic reasons but they still represent an important part of the community. Court-mandated community service, high school service requirements, and fraternal orders or sororities will often require documentation of service hours completed by participants. Challenges such as transportation or physical ability level may make remote participation more desirable as a community service activity. This was particularly true during the COVID-19 pandemic, when in-person service activities carried additional risks to participants or were unavailable due to community restrictions and institutional closures.
When service work is completed independently rather than at events (either in-person or virtual), accommodating participants who need documented hours will require your platform to have some form of time-tracking capability. If this capability is not built into your platform, you and your team may choose to incorporate your own informal methods — if this is the case, it is important to communicate this to your participants early, so they can make sure your methods will be suitable for their service needs. Even after the COVID-19 pandemic subsides, remote work and remote volunteering may continue at increased rates, so crowdsourcing projects should anticipate the needs of those participants as much as possible.
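If your platform records only a timestamp for each saved contribution, one informal method is to group those timestamps into sessions and sum their durations. The sketch below assumes a hypothetical export of save times; the 30-minute idle cutoff and 5-minute minimum credit are illustrative choices you would agree with participants in advance, not established standards.

```python
from datetime import datetime, timedelta

# Idle gap that ends a session (an assumed, adjustable threshold).
SESSION_GAP = timedelta(minutes=30)

def estimate_service_hours(timestamps, minimum=timedelta(minutes=5)):
    """Group contribution timestamps into sessions and sum their durations.

    Contributions separated by more than SESSION_GAP are treated as
    separate sessions; each session is credited at least `minimum`
    so a single saved task still counts for something.
    """
    if not timestamps:
        return timedelta(0)
    times = sorted(timestamps)
    total = timedelta(0)
    start = prev = times[0]
    for t in times[1:]:
        if t - prev > SESSION_GAP:
            total += max(prev - start, minimum)
            start = t
        prev = t
    total += max(prev - start, minimum)
    return total

hours = estimate_service_hours([
    datetime(2021, 4, 1, 9, 0),
    datetime(2021, 4, 1, 9, 20),
    datetime(2021, 4, 1, 9, 50),   # same session (gaps of 30 min or less)
    datetime(2021, 4, 1, 14, 0),   # new session, credited `minimum`
])
print(hours)  # 0:55:00
```

An estimate like this undercounts time spent reading instructions or deciphering a difficult page before the first save, which is one more reason to communicate the method to participants early.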
Paid contributions
Financial rewards in a largely voluntary and not-for-profit field raise tricky questions. There are very occasional examples of cultural heritage crowdsourcing projects that offer credits towards related commercial products, such as digitizing specific items,20 but they still largely draw on models of intrinsically or altruistically motivated volunteerism. On Wikipedia, for example, paid editing is strongly discouraged as it is seen to unduly influence editors’ ability to maintain a neutral point of view.21
In some circumstances, financial rewards may be the wrong kind of incentive, and may de-motivate participants as the incentives are unlikely to match the time spent on the task. Financial rewards may also cause voluntary contributions to feel transactional. In cases where the work is especially challenging or highly repetitive but relying purely on machine learning or other software is not (yet) viable, classic outsourcing may be a perfect way to handle it. However, a shift to purely paid work must be considered carefully to understand the impact of removing values related to public engagement from the work, particularly if a project still has voluntary components.
Publishing statistics and enabling collective competition
Somewhere between intrinsic and extrinsic motivations lie different types of competition, collective goal setting, and gamification (the use of game design elements to motivate participation). Below, we caution against building gamification into the core of your project design or interface.22 However, many projects benefit from a carefully designed competition or shared goal-setting to help move the project toward its goals. Further discussion of these methods can be found in the “Planning crowdsourcing events” chapter.
In the “Designing cultural heritage crowdsourcing projects” and “Evaluating your crowdsourcing project” chapters, we encourage project teams to consider how they will measure progress toward their goals. Being able to assess progress is also important to participants who may wish to understand what the project is achieving, as well as the ways their contributions are stacking up. Some people may even wish to see their own activity relative to the activity of others or the range or mean of activity rates across the project.
Challenges and collective goals
In the “Choosing tasks and workflows” chapter, we discuss the ways that projects might benefit by including indications of progress toward goals in their web interface. Project progress bars are not inherently competitive, but their lingering incompleteness can spark action from project participants who want to see them completed.
Another effective approach for achieving your project goals is to issue a challenge to your volunteer community: how long will it take us to transcribe this collection? How many tags can we add to that collection in a week, a weekend, or during a holiday break? The exact volume or contours of a campaign or challenge can vary according to your project needs, but SMART goals (explained in detail in the “Designing cultural heritage crowdsourcing projects” chapter) can help you drive toward success. Challenges to reach targets set by the project by a specific time have been used by the Smithsonian Transcription Center, Dickens Journals Online, and the 1940 US Census transcription project.23 Campaigns around specific topics can help potential contributors find a subject of interest within larger projects and may help projects get coverage in the media.
Case study: suffrage campaigns at the Library of Congress
To mark the Women’s Suffrage Centennial, the 100th anniversary of women winning the right to vote in the United States, the Library of Congress’ By The People platform organized a sustained campaign called Suffrage: Women Fight for the Vote that combined six disparate but related collections. The campaign called fresh attention to these materials, fostered a larger collective sense of shared purpose, and invited people to affirm their belief in voting rights and women’s rights by participating. Campaigns and challenges are infinitely flexible approaches, and we invite you to experiment freely to find approaches that help people connect with your project in ever more meaningful ways.24
Case study: collective goals for the community through WikiLovesMonuments
WikiLovesMonuments is an annual international photographic competition, organized worldwide by Wikipedia community members with the help of local Wikimedia organizations (affiliates) across the globe. Participants — some from within the community, but many newcomers — take pictures of local historical monuments and heritage sites in their region and upload them to the free media repository Wikimedia Commons. The event aims to highlight the heritage sites of participating countries by encouraging people to photograph these monuments and share them under a free license, so that the images can be re-used not only on Wikipedia but anywhere, by anyone.
If you have established an online platform for crowdsourcing activities, and that platform continues over time, this can help to create a sense of momentum for your contributors. In 2016, for example, Lesley Parilla and Meghan Ferriter explored the impact of challenges on volunteer engagement around transcription of Field Book Project scientific field journals in the Smithsonian Transcription Center. The frequency of these SMART challenges — approximately every 6 weeks — and modes of engagement including direct email, shared social media spaces, and coordinated messaging resulted in spikes of activity followed by a “maintaining” period between these structured experiences. Parilla and Ferriter theorize that the rhythm of these challenges and campaigns helps bridge activity and integrate new volunteers as others move on to other activities.25 Additionally, challenging participants to complete sets of tasks can create space for temporary collective competition.26 Participating in collaborative competition — based on a sense of belonging and on personal goals — was cited by Smithsonian Transcription Center volunteers as a reason they contributed regularly to the platform.27
Participants themselves might also wish to set self-organized challenges, such as the number of contributions they can make in a week or month. Some participants like to see their contributions add up over time. If you are using an established platform, it is worth exploring any existing statistics and “leaderboard” tools to help with this task. For example, Wikidata has a variety of tools to monitor the completeness of a dataset based on suggested matches.28
Statistics, gamification, and competition
As noted above, we suggest thinking carefully about the ways gamification may overshadow the intrinsic value of the work. Some projects have integrated gamification well as a means to catalyze excitement about tasks that might otherwise seem tedious or repetitive.
Case study: purposeful gaming and the Biodiversity Heritage Library (BHL) project
In 2015, BHL and partners at the Missouri Botanical Garden, Harvard University, Cornell University, the New York Botanical Garden, and Dartmouth College undertook research and design around gamified approaches to improving access to digital texts. Through animated games developed by Tiltfactor, participants could help refine OCR and transcription outputs. The gamification included time-based play, hybrid automatic leveling, deepening complexity of tasks, in-game feedback (e.g., buzzers), and incentives including redeemable points and game-based (rather than altruistic) motivation. In the Smorball game, a typical level presented 30-60 words for a player to correct through the logic of the animated sport. The project also evaluated an altruistic game, Beanstalk, in which players gained plant height as they transcribed words; there were no time constraints as players contributed to BHL collections. Both were, in essence, OCR correction games. While the project found limitations in gathering a critical mass of data, its exploration of purposeful gaming for review and enhancement of cultural heritage source materials opened doors to multiple motivations through a carefully designed participant experience.29
It is worth considering that for some participants, game-like features such as leaderboards can be demotivating. New or occasional participants, for example, may feel that they cannot contribute as much as the few most frequent contributors. If your project's goals include deepening engagement with the content, themes, or experiences, gamification may draw participants away from those goals: chasing points may diminish the time participants could spend becoming absorbed in a diary's stories, characters, and archaic or clever turns of phrase.
Still, your project may have reasons to integrate individual participant statistics in its interface. If adopting this approach, we recommend not displaying individual statistics prominently, and avoiding a cumulative project leaderboard. FromThePage, for instance, tucks leaderboards away on a statistics tab for each project, and it takes some effort to find them.
Rather than cultivating interest in a leaderboard, you could instead:

- Communicate to participants what statistics you are representing and why this activity matters
- Consider displaying statistics for each type of task in your workflow, to include people who excel at different types of work
- Allow individuals to see their own statistics as part of collective contributions towards a shared goal
- Consider leaderboards that capture recent activity only (e.g., for the last 7 days), giving volunteers opportunities to elect to compete each week
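A recent-activity leaderboard of the kind suggested above is straightforward to compute from a contribution log. The following is a minimal sketch, using made-up participant names and timestamps, of counting only contributions within a rolling window:

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical contribution log: (participant_id, timestamp) pairs.
contributions = [
    ("ada", datetime(2021, 4, 20, 10, 0)),
    ("ada", datetime(2021, 4, 27, 9, 30)),
    ("grace", datetime(2021, 4, 28, 14, 15)),
    ("grace", datetime(2021, 4, 28, 16, 45)),
    ("mary", datetime(2021, 3, 1, 12, 0)),  # outside a 7-day window
]

def recent_leaderboard(contributions, now, days=7):
    """Count each participant's contributions in the last `days` days,
    sorted with the most active participant first."""
    cutoff = now - timedelta(days=days)
    counts = Counter(user for user, ts in contributions if ts >= cutoff)
    return counts.most_common()

print(recent_leaderboard(contributions, now=datetime(2021, 4, 29)))
# → [('grace', 2), ('ada', 1)]
```

Because the window resets each week, a long-standing contributor's cumulative total never puts the top spot out of reach for newcomers, which addresses the demotivation concern raised earlier.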
Participants make your project come alive. Rather than an amorphous, faceless crowd, you will find a range of interested individuals who engage as your project gets underway. Thinking of them as a diverse group of people, rather than a monolithic crowd that you can serve “as one,” is important and raises complex considerations around motivation. Each person may come with different, and even evolving, motivations for getting involved, and with different needs for invitation and connection with your project. This diversity of motivations and needs may spur further reflection on the demographics of your participants and how those demographics may affect the way they participate in your project.
This chapter aimed to paint a picture of the range of motivations that may bring people in and keep them engaged — some focused more on tasks, some on connecting to the content itself or to the community, some on the feeling of having an impact. We have also highlighted ways in which these motivations can be supported and recognized.
Following this chapter, you could go on to explore the “Supporting participants” chapter to see how to implement a user journey that is sensitive to people’s motivations or perhaps explore the chapter on “Choosing tasks and workflows,” also bearing in mind what motivates people.
I believe one of the very first successful gamified crowdsourcing OCR-refinement initiatives was the Finnish Digitalkoot, which probably deserves a case study: https://www.researchgate.net/publication/221604727_Digitalkoot_Making_Old_Archives_Accessible_Using_Crowdsourcing
Thanks Vahur! I love that project. I have a feeling I saw a presentation on it at DH2012 from the National Library of Finland to supplement that, but the DH2012 abstracts have disappeared from the web…
I definitely agree with this statement. Also, the ability to link to that contribution tracking is important. It enables volunteers to obtain acknowledgement and credit for their work, which they can then add to their CV or report back to their place of work about the volunteering undertaken. I know, for example, that volunteers at BionomiaTracker, a crowdsourcing platform linking specimen data to the collectors of those specimens, are able to link their “scribe” profile on that site to their ORCID iD (the iD used to log into the site), thus getting credit for their work.
This is an interesting point. I know I link my ORCID iD to my crowdsourcing projects, thus ensuring I get acknowledgement for my work. Designing projects to enable this and other such linking to happen is a practical way of ensuring participants get acknowledgement.
So agree with this statement! I also agree with the rest of this paragraph, where Austin expands on the trade-off that may be required. I know from my own experience that the off-putting nature of having to create an account can be partially mitigated by giving me the ability to create an account with a pseudonym.
Very good point about pseudonyms!
The Smithsonian Transcription Center also has a “My Work” section that includes a detailed list of the total number of pages and projects each individual user has transcribed by date and time. These reports are also available as a downloadable PDF on Smithsonian letterhead with a further explanation of the importance of their contributions.
!!!! :) We of course also have additional feedback since 2016 on volunteer motivations and interests, and are working with the Smithsonian Office of Audience Research (SOAR) to conduct a comprehensive volunteer survey in the Transcription Center (although this will probably not be sent out to the public until winter 2021); but happy to share any additional info or provide further examples if you all are interested in more case studies.
In response to a flood of demand for documentation from volunteering students during the pandemic, By the People prioritized development of a self-service downloadable official letter of service, as well as other enhancements to our profile page’s documentation of contributions. Happy to write up something about this if it’d be a useful example to include. Some more info at the end of this blogpost - https://blogs.loc.gov/thesignal/2021/07/making-room-in-the-crowd-library-teleworkers-transcribing-in-extraordinary-times/
That’s really generous, thanks Lauren!
I’d add “and honor the women who fought for those rights by improving the discoverability of these collections”, which was/is central to our outreach. Additional context that might be of interest: media picked up on this call, leading to a surge in participation. To meet demand for suffrage materials, BTP added 2 additional suffrage collections in the 2 years following (two years on, 120,000 transcriptions had been completed).
Thanks so much for that addition. (One day I’d love to see some analysis of what kinds of messaging is more likely to be picked up in media stories - it’s probably Journalism 101 in terms of what’s easier to pitch - but there’s probably an art to successfully balancing that with the need to acknowledge previous work and scope)
I’m not sure it’s mentioned before this so “By the People crowdsourced transcription platform” would give fuller context
*thumbs up* emoji - good catch, Lauren!
When the suffrage topic launched it included 5 collections, but has since grown. Suggested edit “that initially united 5 disparate but related collections of suffragists’ personal papers”
I came across this after our sprint but it might be a good reference for participant rewards, e.g. ‘The volunteers are invited to ETH Zurich at least once a year and receive awards in public for their contributions to our catalogues (ETH Library, 2019c). Six of them were thanked by publishing video interviews (Figure 2) in which they explain (in German) why they participate in our campaigns (ETH Library, 2019d).’ Wiederkehr, S., 2019. Open data for the crowd: an account of citizen science at ETH Library. LIBER Quarterly, 29(1), pp.1–10. DOI: http://doi.org/10.18352/lq.10294