by Mia Ridge, Samantha Blickhan, Meghan Ferriter, Austin Mast, Ben Brumfield, Brendon Wilkins, Daria Cybulska, Denise Burgher, Jim Casey, Kurt Luther, Michael Haley Goldman, Nick White, Pip Willcox, Sara Carlstead Brumfield, Sonya J. Coleman, and Ylva Berglund Prytz
Published on Apr 29, 2021
9. Supporting participants
This chapter helps you plan and deliver projects that are mutually beneficial for all parties: your project team, participants, and others you encounter along the way who bring their experiences, skills, and feedback to your project. Building on the “Understanding and connecting to participant motivations” chapter, it focuses on how you can invite participation and facilitate engagement throughout a project. It discusses common questions from participants, working with students, and creating spaces for discussion and connection, and introduces the 5 Cs — care, convening, connection, communication, continuation — for supporting participants.
In this chapter, we advocate for human-centered approaches, issuing mindful invitations to participate, and foregrounding inclusion. The values expressed through your project will influence its culture. This, in turn, will affect how people behave within the project, and who feels included. When thinking about participants, we invite you to consider values such as human-centeredness, and the importance of access, accessibility, and inclusion (for more information, please see the “Why work with crowdsourcing in cultural heritage?” chapter).
This chapter will be particularly useful for those taking on the role of community coordinator. Additional skills and responsibilities are discussed further in the chapter on “Designing cultural heritage crowdsourcing projects.”
Designing an inclusive and rewarding participant journey starts with considering participants’ motivations and needs. What are they? To quote some suggestions from our volunteer survey:
“A really good interface”
“Friendly and welcoming community of volunteers”
“Regular updates from the researchers”1
Designing and implementing a participant journey
When you undertake a crowdsourcing project, you have the opportunity to set conditions for your success. In the “Understanding and connecting to participant motivations” chapter, you learned that participants present a range of motivations and that those motivations may change throughout their participation. With those motivations and needs in mind, what can you do to craft meaningful entry points and connections with your project, sustain the participation of some contributors, and also meet your goals?
Throughout this book, we advocate for engagement and connection as metrics for success alongside the amount and relevance of data generated. Indeed, as machine learning-led methods improve and the need for organizations to demonstrate their relevance increases, levels of engagement and connection with people are core goals in some projects. In addition to connecting the needs of your participants with your technical project design, carefully developing practices and resources might allow you to achieve additional, unanticipated benefits. You can create vibrant life-long relationships, transformative moments of engagement with new worlds, environments that sustain the recurring exchange of information, and even sets of practices that help people reimagine the ways they interact with each other.
Paying close attention to the needs, expectations, and desires of participants, contributors, and their relevant communities is therefore key. Core considerations for this area include accountability and power, ethics, and a focus on broad definitions of success.
We invite you, through reflecting on your project’s values, to consider how you can respect the value, integrity, and safety of your participants. This can include making sure your project includes informed consent and clear conditions of participation, processes to protect privacy or communicating circumstances in which participants cannot be anonymous, considering safety, and reducing harm to participants connected to your project.2 Explore these human-centered considerations further in the “Identifying, aligning, and enacting values in your project” and “Designing cultural heritage crowdsourcing projects” chapters.
Further, valuing your participants means focusing on an equitable exchange — working with participants in deeply respectful and mutually beneficial ways, appreciating that participants are not just there to generate content for you, and will be seeking their own forms of reward in line with their initial and continued motivations. Thinking about how the project can give back to participants is quite important here.
Inclusion is a process, present in the decisions you make throughout your project, from technical design to communication to outreach. We encourage you to be explicitly inclusive as you begin to invite participation in your project.
Create an effective call to action
To invite participation in your project, you should be prepared to explain in simple terms what you are trying to accomplish and the ways people will be able to contribute to that goal. Creating a call to action that succinctly expresses the why, what, how, and (potential) who of your project will help you develop the core of your messaging to potential participants. In short, what difference does your project make in the world?
This call to action can be a central component of your outreach plan. You may find it requires iteration and refining messages to find the best fit. For example, the initial call to action for the By the People3 project at the Library of Congress described it as a crowdsourcing project to help make Library of Congress collections more accessible and useful. However, feedback from potential participants at the National Book Festival and the American Library Association annual meeting helped reframe the call to action. The project team found replacing “crowdsourcing” with “volunteering” made the invitation resonate with more people.
Your message should be captured in simple terms and can include the rationale or justification for your work, why it is important to your targeted group/subject and associated communities, and why it matters within a broader human context.
Case study: providing hooks and excuses to celebrate with mini-projects
Potential participants encountering a project for the first time may need help finding ways into a project. One way to provide lots of “hooks” to attract interest is to break larger collections into “mini-projects.”4 Mini-projects could be based on single notebooks or archival boxes, bound volumes or photographic albums, and sub-collections. Listing or showing the kinds of themes, tasks, collections, regions, or periods to be found within a mini-project can provide those hooks. Small projects, ranging from tens to hundreds of pages, allow more rapid progress to be made, providing opportunities to celebrate the completion of each mini-project with contributors.
At launch, the Smithsonian Transcription Center5 provided many hooks for a participant to find content they might be interested in, including themes (such as “Civil War Era” or “Field Book Project”) and source organizations (specific museums or archives). Notes from Nature6 also divides its material into smaller batches as this gives the team more opportunities to “celebrate the success of completion.”7 These examples inspired a similar model based on volumes of playbills for In the Spotlight.
Touchpoints are the places at which people will connect with you: for example, your project website, project interfaces, webinars, social media, in-person events, and spaces within their communities (of practice or otherwise).
Participants may join your project in many ways, including at virtual and in-person events. They may also connect via different touchpoints over time. You might consider how your project would benefit from enabling different types, depths, and frequency of engagement — then design touchpoints accordingly.
Establish an outreach plan for participants and communities
From our experience, participants may not immediately flock to your project as soon as you set it up — the adage of “if you build it, they will come” is rarely true, especially if you want to engage diverse and underrepresented crowds. However, many people would be interested in connecting with your project if they knew it existed.
You need an outreach plan. Think broadly first, embracing principles of inclusivity. You will already have defined your project goals and a call to action. What would people gain from contributing to your project? Even if your project is seeking subject matter expertise and assistance, you will benefit from reflecting on the range of individuals and communities that intersect with that expertise.
You can start simply. Using your core statement or call to action, create a flyer, Instagram/Twitter account or posts, or a simple web page that explains who you and your team members are, what you are doing, and why. This online presence will become the place others turn to for information, clarity, and support.
Your outreach (or marketing) strategy should target your intended or relevant communities. It is very important to try to reach spaces where your potential participants already are, rather than expecting them to come to you. Working within existing relationships and community structures could be a good way to connect with people. This approach can also mean working with partner organizations that may have direct connections to the people you want to engage. This process will facilitate your ability to “find a crowd”: pockets of people who are interested in your work, share your mission or vision, and will sign up in numbers to participate in your project. More information on building community partnerships can be found in the “Connecting with communities” chapter.
Going deeper, you could also actively reach out and invite people with appropriate expertise into the project. These people could be members of your advisory board, content experts, metadata experts, research experts, and/or community members. For example, community members may not have readily legible skills in a digital environment and will need to be supported to share the wealth of information and expertise they have in many other areas.
Gotcha: supporting mobile and tablet devices
If you plan to promote your project on social media, make sure that as much of it as possible works well on smaller devices such as mobile phones and tablets. It might not be possible for people to complete tasks that require larger screens — for example, transcription tasks that require an on-screen keyboard and a larger image of the original item — so you should consider how to encourage mobile users to access your project on other devices when it is more convenient for them.
Crowdsourcing with students: issues to consider
One interesting model for recruiting participants is to work with students in a formal education setting. We would like to share some reflections on things to bear in mind when working with students.
One method of encouraging students to be receptive to a crowdsourcing project is to offer deep context with relevant contemporary connections. This explicit framing can best be accomplished through curriculum work, in partnership with the school district and/or classroom teachers. Curricula can be created, adapted, and/or co-created with partners. Curriculum work can also take place in the classroom of a single teacher who, through a personal relationship or shared interest, may reach out to the project and express interest in having their students participate in your transcribathon or editathon.
As has been discussed previously in this chapter and in the “Identifying, aligning, and enacting values in your project” chapter, crowdsourcing offers project teams and participants the opportunity to engage in a radically democratizing experience. Students — who are often knowledge consumers — instead become knowledge producers. Youth, who are often recipients of knowledge, instead become the shapers and builders of knowledge and access.
If the crowdsourcing project is folded into a curriculum, it increases the possibility that students will have fewer barriers and more incentives to participate as transcribers and/or editors. As part of the curriculum, students will know that they are being graded on their work. Crowdsourcing will be part of their general learning activities and experienced in familiar classroom settings along with their peers. The curriculum should include explicit information and training on primary and secondary sources, archives, access, and the principles of crowdsourcing. This provides the broader context that will help students understand the scope of the crowdsourcing project and their labor.
There are important trade-offs and power dynamics to consider when educators crowdsource cultural heritage work to students.8 Students may not be able to decline the assignment without hurting their grades, so the educator needs to ensure the activity provides real educational benefits to the students, not just labor for their scholarship. Further, the dual goals of learning and productivity are often in tension. Learning often means increasing participants’ cognitive effort, while productivity (i.e., completing tasks efficiently) seeks to reduce cognitive effort.9
The utmost care must be taken when approaching challenging topics. If the subject matter is challenging — dealing with issues of racism, poverty, violence, objectification, and/or subjects that touch on or are associated with race, gender, ethnicity, and other areas of identity — then students, parents, teachers, and administrators must be explicitly prepared and supported to process this information. In addition to establishing your project on a set of explicit values, create the space and guidelines for clear discussions, send letters home to parents, and include a trigger warning in lesson plans where necessary. Please be attentive to the possibility that you may need to change your plans in response to the ways your participants and/or organizations respond to your work.
Case study: crowdsourcing in the classroom, student learning in the History Unfolded project
The History Unfolded (HUf) project at the United States Holocaust Memorial Museum10 started with the premise that a crowdsourced research project creates a great environment for learning history. HUf asked high school students — a primary audience for the project — as well as the general public to seek out and submit newspaper coverage of what we now call the Holocaust as it was reported in their local newspapers during the 1930s and 1940s. During the process, we proposed that students would not only learn about Holocaust history in general but also consider questions about what was known by average Americans in their own communities and what, if anything, Americans tried to do to combat genocide in Europe. Not coincidentally, these learning objectives matched the goals of the USHMM’s Americans and the Holocaust initiative.
While we were excited to see the project attracted a lot of interest and produced a lot of content — close to 5,000 participants submitting over 45,000 articles from all 50 states — we were most interested in the ways students were learning and reflecting on Holocaust history and how it connected to their own home towns. In anecdotal and evaluative responses, we could see clear evidence of how the project had an impact on student thinking.11
During the project, students questioned their assumptions that Americans simply didn’t know what was going on during the Holocaust with statements like: “Prior to working on this project, I always just assumed that Americans were pretty unaware of the events going on in Germany...I just thought it was interesting that Americans..., a lot of them knew what had happened and it was just often overshadowed by events in their hometown…”12 Students participating in the research showed not just improved understanding of the history but were able to tie the history to their own, local history.
Students also made connections between the historical events of the Holocaust and current events. Though not an explicit part of the project, students applied their new historical perspective to how they understood modern debates about refugees and interventions. One group of students used their experience on HUf to develop their own crowdsourcing project to investigate how Americans have responded to more recent genocides such as Rwanda.
Finally, students became more interested in the research, and in historical research in general, as they participated in the project — often expressing sentiments like: “I thought it was fun to look through the newspapers. It was a good way to use my time because I don’t really do anything else except sleep and eat. It was a good change of pace and a good way to use my time, and thinking, and doing something....(Yes) sometimes I did it on my own time.”13 Though not one of the project’s explicit goals, students expressed an increased interest in historical research and methods.
To create a learning environment, the team established clear values for the student experience: authentic work for students (this is real research), attributed contributions (including articles used in a new exhibition), and treating every interaction as a learning opportunity (for us and for the students). Perhaps most important was our belief that History Unfolded could bring a meaningful encounter with Holocaust history directly into classrooms through the practice of historical research itself.
Planning to support your participants is as important as planning to engage them. This section describes how you can make the most of the invitations you have extended to participants. The 5C framework is an option you could use to consider all areas of participant support: care, convening, connection, communication, continuation.
Here are some questions you can ask yourself about the environment and conditions you have set for your participants, along with the 5C framing:
In what way have you created choice and agency in your project’s tasks, workflows, and opportunities for feedback and learning?
Have you considered whether your project may prompt so much activity that it exhausts participants, or whether your calls to action could result in volunteer fatigue?
Is there a risk that the content or narratives of the collections might (re)traumatize participants?
How will you integrate current events and the ways they may affect participants?
How will you ensure the safety of your participants? This may not feel immediately relevant, but it is important to consider how participating in a project may put someone at risk (e.g., via security threats in countries where freedom of speech is controlled), and whether you can mitigate it.14
Where will you convene activity and guide activities as you bring your participants together?
How can the rules of convening be governed in a way that creates a safe, welcoming, and inclusive space?
How can you guide participants through your project in a way that provides a good user experience?
How will you create entry points for participants at different skill levels and experiences?
Can you put aside the time to understand the motivations of your (potential) participants, and consider how to meet their expectations?
How can you enable participant-to-participant interaction?
How do you plan to acknowledge participants’ contributions?
How will you communicate with participants during and after a project?
How will you moderate public spaces of your projects, e.g., discussion boards?
How do you keep your participants engaged over time?
How can you contextualize historical materials for better understanding by participants?
How can you design an environment for respectful and mutually beneficial interaction, establishing opportunities for regular discussion and feedback?
How will you create paths for participants to skill up, find new opportunities, and apply their skills in different contexts?
How will you gracefully wrap up a project when the time comes to close it?
These are questions for you to consider when you design your tasks and assess the quality of your participant experience and flow through your project. Below we focus on more concrete and immediate ways in which you can support your participants.
Guiding participant activity
Participants are usually volunteers, so to motivate them to join and stay with a project, you need to strike a balance between training them enough to give them confidence and giving them productive, meaningful work as soon as possible.15 Ideally, you will have run usability tests to identify and reduce barriers to entry and made inspiring design choices that minimize the need for additional training, but some specialist tasks require additional support. What should that training look like? When should you do it?
Because this is often volunteer work, many people are intrinsically motivated to teach themselves. You should provide formal training material such as help pages and tutorials, but also encourage them to be comfortable with “getting it wrong” (if your platform makes this possible). If your platform allows volunteers to return to previous work and correct it based on how much they have learned, let them know. Similarly, signposting early on if participants are unable to go back and correct previous work will help to avoid confusion.
Assessing the participant skill level
To ensure participants can successfully complete tasks and have a good experience, you want to make sure to match them with tasks that fit their current skill level. If a participant is new to the project, you might want to ensure they have completed some basic training before they complete tasks — otherwise, their work quality might be poor, they might find the experience frustrating, or both. Likewise, more experienced participants might be routed to more advanced tasks that better fit their abilities and interests. But how can you determine an individual participant’s skill level?
One approach is to rely on platform features such as the date the participant’s user account was created, or the number of contributions in their account history so far. While this is a relatively fast and easy way to assess participant skills, it is also relatively unreliable. Some participants might not have created an account for their early contributions, or might have created a new account after misplacing a password or some other issue.
Another approach is to directly assess participant skill level with a quiz. Project teams may want to create an interactive quiz that tests the key skills necessary to perform well on tasks. This method also brings some caveats: it can be hard to design a quiz that quickly and effectively assesses the proper skills, and gatekeeping can be a frustrating experience for participants if they feel they are being unfairly excluded.
A third approach is simply to ask the participant about their skill level, or what level of the task they would like to work on. This approach relies on participants to accurately assess their own skill level — a potentially difficult proposition, especially if they are unfamiliar with your project or tasks — but provides maximum flexibility for the participant.
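These three signals could be combined into a simple routing rule that prefers the most reliable information available. Below is a minimal sketch in Python; all field names, thresholds, and tier labels are hypothetical illustrations, not features of any particular crowdsourcing platform.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Participant:
    # All fields are hypothetical; real platforms expose different data.
    contribution_count: int             # tasks completed under this account
    quiz_score: Optional[float]         # 0.0 to 1.0, None if the quiz was skipped
    self_reported_level: Optional[str]  # "beginner", "experienced", or None

def route_task_level(p: Participant) -> str:
    """Pick a task tier for a participant, falling back gracefully
    when a signal is missing."""
    # 1. A direct assessment (quiz) is the strongest signal when present.
    if p.quiz_score is not None:
        return "advanced" if p.quiz_score >= 0.8 else "basic"
    # 2. Self-report provides flexibility when there is no quiz result.
    if p.self_reported_level == "experienced":
        return "advanced"
    # 3. Account history is fast but unreliable; use it only as a fallback.
    return "advanced" if p.contribution_count >= 100 else "basic"

# Example: a new account, no quiz taken, but a self-reported experienced user
print(route_task_level(Participant(0, None, "experienced")))  # advanced
```

The ordering itself is a design choice: a project that values inclusivity over gatekeeping might instead let self-report override a weak quiz score, or simply offer all tiers and let participants choose.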
Creating your support and training material
You should start drafting help and training materials as you design the project, then update them intensively during usability testing and in response to comments after launch. Ideally, you should also review them annually, to allow for changes in the project and wider context over time.
Once you think your project is ready, you should test it by creating example tasks. By working through your example tasks, you can consider the questions you have, resolve them in a standard way for the project, and take screenshots for your documentation. This will also allow you to check that any data created will be in a usable format. You can also refer participants to your examples throughout your documentation.
We recommend you do one or more usability tests after your project is configured but before you launch publicly. You can create a limited-release prototype and test it with a handful of coworkers or very involved stakeholders who are both less familiar with the task than you are and less technically savvy. You could also invite them to a short session — in person is often easier, but online works too — and ask them to complete the task, “thinking aloud” as they go.16 Write down every question they ask, watch what they struggle with, and consider what they do and do not ask about. Fold what you have learned into improving the project. This type of testing should involve “friendly” volunteers. These may be participants who already volunteer for your institution in another capacity, or might be from a group you have a relationship with. In addition to making sure your task and documentation are clear, you can record your training for these participants and use it for “on-demand” online training.
Types of training/documentation that you may want to consider building include:
Short contextual help available on the task page
Frequently Asked Questions (FAQs), ideally based on questions asked during usability tests and updated in response to new questions
Examples of completed tasks and material, ideally in the platform so participants can review them “in situ”
Tutorials and video or slideshow-style walkthroughs. These help potential participants get a sense of what kinds of skills or knowledge a task involves and are extensively used on Zooniverse17
Printable guides with screenshots. Some participants enjoy having a guide they can print and keep next to their computer
Live and recorded training sessions
Documentation about transcription conventions and other information participants will need to know, tailored to the task and collection items at hand18
Information about how the results will be used, to help participants make decisions in undocumented situations
When to train participants
There are a handful of times that training can occur.
When a participant joins a project
Thinking about training needs is a key part of onboarding new volunteers. An in-person event is a very effective way to get participants past the uncomfortable feelings we all have when we encounter something new and different (and crowdsourcing tasks, materials, and software are all new to many volunteers). Online events can also be effective at introducing and training folks. A welcome email that explains the purpose of the project and how to contact the project leader may also be useful.
When a participant runs into a hard or confusing task
You will not usually know when that is, but when they have a question about a task they will look for an answer (often not before!). In this case, the participant may look at contextual help on the page, examples, and reference material you supply in an introductory email or on the initial page for the project. They may also engage with others in the community or the project leader through notes or forums that point back to the exact context for the problem, email, or live chat windows (with clearly articulated “office hours”).
We recommend considering one or more of these routes when designing training spaces for your project.
When to consider alternatives to voluntary crowdsourcing
Just because you can ask your participants to do things for your project does not mean that you always should. For some tasks and some audiences, asking for free labor from your crowd raises questions around equity, inclusion, and even simple practicality that should be considered. These complex situations often defy one-size-fits-all solutions and need to be worked out according to the situation of each individual project. For some tasks, other models beyond crowdsourcing — contracting the work, paid internships, etc. — should be considered.
Consider the communities you work with when planning a task. Many people from underrepresented communities experience frequent demands on their time, and expecting free contributions from underrepresented groups may raise questions of equity and respect. The Wikimedia movement, for example, is currently debating the issue of how to recognize the lack of equity built into its system of volunteer contributions.19 Similarly, the concept of volunteering is not equally recognized across the globe so not offering payment could limit the diversity of participation.20
Asking student participants to contribute to some crowdsourcing projects also raises questions, as discussed above. Not only must projects work to avoid taking advantage of student labor, but consideration should also be given to who else could benefit from their work. For example, in the case of Wikipedia, unpaid student labor editing articles can be seen as generating revenue for search engines, such as Google, which draw on Wikipedia for their results. How you respond to issues of fairness and equity will largely depend on the platform and specifics of your project, but you should be able to answer questions about what type of benefits participants will receive from your project (such as access to resulting data, acknowledgment, etc.).
In some fields, like computer science, it is the norm to assign some types of tasks to paid services — including paid crowdsourcing platforms such as Amazon’s Mechanical Turk.21 Academic projects that require ground-truth datasets for machine learning often rely on paid services. As in the case study below, paid platforms can be used to prototype bespoke platforms or new, untested workflows and identify social interactions that may be present in future platforms or communities. As with all project choices, the values of your project and community should guide your decisions.
Case study: prototyping social interactions with paid crowds in Second Opinion
Second Opinion22 is a software platform developed at Virginia Tech to support last-mile person identification in historical photos. Second Opinion builds on Civil War Photo Sleuth (CWPS),23 a website that combines crowdsourcing and AI-based face recognition to identify unknown people in American Civil War-era photos. In this context, “last-mile” means that face recognition has suggested several high-probability matches for the mystery photo, and the user has to select the correct one among them — often, a challenging task. To get help with this task, users of Civil War Photo Sleuth often asked the community for a “second opinion” on their decisions. We built the Second Opinion software tool to support these requests.
A key challenge in building Second Opinion was that, because it was collaborative software, testing required a small group of people. Every time we tweaked a workflow or user interface design, we needed to recruit not only a CWPS user but also a group of six other people providing their second opinions. Although the CWPS community is full of generous volunteers, we knew we could only ask them for help so many times. Since the software was still under development, it was glitchy, and volunteers might get frustrated. Instead, we opted to hire paid workers on the Amazon Mechanical Turk (MTurk) crowdsourcing platform. These workers would be paid regardless of whether the software worked or not, so they did not find bugs as frustrating. And because MTurk hosts a large labor pool, we could run many tests with different users without exhausting their patience. Once we ironed out the bugs and found a collaboration model that worked, we began integrating these features into the main website where volunteers could begin to test and use them. In this way, paid crowds on MTurk allowed us to prototype social interactions in a way that benefited them financially, while respecting the valuable time offered by our volunteer community.
Toolkit: common questions from participants
Answers may vary by project. We have provided some example responses below.
1. What if I can’t do the crowdsourcing task? Try breaking it down into smaller, more manageable pieces. You will be able to do more than you think! Give it a try — it will get easier with practice.
2. What if I make a mistake? Will I look stupid? Crowdsourcing does not require you to be perfect, only to do what you can. The next person will pick up where you left off and catch mistakes. It is okay to make mistakes; your skills and knowledge will improve with practice.
3. Why can’t a computer do this? Computers are only as “smart” as we can program them to be. Artificial intelligence cannot (yet) understand what the human eye and mind can. You can use contextual clues to make informed decisions, in ways that computers cannot.
4. Where do I start working? Did it save my contribution? These questions suggest that your usability or documentation needs some improvement.
5. Who will check my work? Your fellow participants will check your contributions, and you can help by checking theirs. Staff and other advanced users will also review your work, and you may have the opportunity to become an advanced user yourself over time.
6. How do you know the results are going to be accurate? Crowdsourcing has been shown to produce fairly accurate and reliable results, and review processes help verify their accuracy. Even if the results are not 100% accurate, they are usually an improvement. Introduce the concept of iteration: collectively, we move closer to completion and accuracy without ever fully arriving, and that is okay.
7. Aren’t you just asking people for free labor? We have invested in the project, and here is how our values are built into its structure. The crowd’s efforts benefit them too, and the results are freely shared.
8. What about vandalism? How is my work protected? We can always revert to a previous version of the work so that nothing is lost. Point to governance and policies that address how vandals will be treated.
Creating space for communities to grow
In the course of developing a crowdsourcing project, it is important to create spaces for participants to connect with the project beyond the crowdsourcing tasks, should they wish to. Invitations to participate in a project need to welcome participants as whole people, brimming with curiosity, questions, and an eagerness to learn more about a subject. Sometimes, those needs can be met through project scaffolding documents (including tutorials, welcome pages, and FAQs). Relying on written documents alone, however, can be a missed opportunity to build more robust communities around a project. A strong sense of community can drive participation and foster a deeper connection to the importance of a participant’s contributions. Growth here refers both to individual participants and to the number of communities that can form in these affinity spaces.
Creating spaces for people to connect with a project can take a variety of forms, which fall under three rough headings.
First, some projects offer the opportunity for participants and organizers to communicate directly with each other. Platforms or project teams may create this space through message boards, where contributors can post questions or share findings (for example, Zooniverse’s Talk Boards, the American National Archives’ History Hub,24 self-organized WikiProject,25 or a project hashtag on Twitter26). Second, some projects generate momentum and points of contact by creating campaigns and time-bound challenges that take place over a matter of days, weeks, or months. Third, and perhaps most popularly, some projects hold events that gather people together (in person or online) for simultaneous work on a given project. For example, participating in campaigns and group challenges around specific collections or tasks can help to create a sense of community. You can find out more about events linked to crowdsourcing in the “Planning crowdsourcing events” chapter.
No matter what approach to community outreach you choose, the goal remains the same: to give people a sense of shared purpose and increased enjoyment in a project. The investment of time and effort involved can have direct results and attract impressive rates of contribution. More importantly, creating spaces to connect allows your participants to share their insights and perspectives with each other as well as with the project team. We find that participants can often make the most powerful cases for the value of a given collection, project, or collective effort. Let them be your best ambassadors!
Discussion boards and hashtags
For many participants, the opportunity to talk with each other and the project organizers about their shared work is an important ingredient in their satisfaction with a project. Some projects use platforms that build discussion boards directly into the interface for the crowd-work itself. These boards allow for participants to ask questions, get support on problems they encounter during the work, and share their reflections and ideas about the subject matter of the project. The discussions can become passionate and detailed opportunities for everyone involved to share their enthusiasm, interest, and knowledge. These boards are where individual contributors can get to know each other and form a community with shared goals.
Maintaining these discussions demands considerable work and skill from the project team. Without prompt and thoughtful responses, the discussions will not develop, and feedback needs to be sensitive and supportive. Project organizers also need to set ground rules for the boards, to ensure they stay civil and productive, and be ready to step in when conflicts arise.
Some projects use existing social media platforms to supplement or replace built-in discussion boards. Using groups and suggested hashtags to identify posts about the project incorporates the discussion into platforms participants are already using. Whatever tools are used for direct communication with and between participants, cultivating the community requires a real investment of time and skills development for project organizers.
Perspective: community spaces as sources of spontaneous discovery — Samantha Blickhan, Zooniverse Humanities Lead
When we are onboarding new teams to the Zooniverse platform, they don’t always see the Talk (message) boards as the most exciting part of their crowdsourcing project. Before projects launch, we spend a lot of time encouraging them to be present on Talk, but it isn’t usually until after a project has launched that the team realizes how transformative it can actually be, and we get wonderful messages with examples of participants engaging more deeply with source materials than anticipated, or exciting discoveries people have made that often have little to do with the primary aims of the project (in terms of tasks and/or workflows). There are two stories I particularly love to share about discoveries made on Talk platforms. First, the Green Peas — an entirely new class of galaxy discovered by volunteers on the Galaxy Zoo project (including their use of “Give peas a chance” as a rallying cry).27 Second, the antedating of the term “white lie” by a volunteer on the Shakespeare’s World project, which showed that the term had been in use almost two hundred years earlier than previously thought!28 While these are wildly different types of discoveries, the connection between them is that in both cases, project teams were actively communicating with — and listening to — their participants on Talk. If these teams hadn’t been using these spaces to interact with their respective project communities, these important discoveries may have gone unnoticed.
We close with a quote from Siobhan Leachman, a prolific, experienced, and highly motivated volunteer who has contributed to many of our projects. In this lightly edited quote from her response to our survey,29 Siobhan makes many of the points we discussed throughout the chapter, and indeed the entire book:
“First, the volunteers should have the ability to freely reuse not just the content being generated by the crowdsourcing project, but also the digital surrogates being worked on by volunteers during that project. To be TRULY great the crowdsourcing project documentation and the software the crowdsourcing runs on should also be openly licensed for reuse. This would enable other crowdsourcing projects to be more efficiently established and encourage the adaptation and improvement of crowdsourcing project documentation.
Second, the project must contribute to the public good by enhancing knowledge. To be great, this “Why” for the existence of the project should be succinctly communicated to volunteers prior to their commencing work. It should underpin and be the guiding light for the project. This “Why” should also give volunteers clear information about how the project will shepherd the resulting crowdsourced knowledge through to the appropriate organisations or researchers.
Third, the crowdsourcing project should actively encourage and enable volunteers to serendipitously explore the information and knowledge to which they’ve been led by participating in the crowdsourcing project. This exploration might take the form of asking further questions or raising issues with project organizers or it may take the form of more external engagement, such as volunteer-initiated research into topics they have been inspired to investigate. This deeper engagement by volunteers, although possibly reducing the volunteers’ time assisting the project to achieve its first goal, its “why” for existing, is in my eyes an equally valid if not even more important outcome of a crowdsourcing project. A great project offers up and encourages this serendipitous engagement as a reward to volunteers who participate.
Fourthly, a great project supports and encourages engagement and contact not just between the volunteers and the folk running the project but also actively supports and encourages engagement between its volunteers. The social aspect of a crowdsourcing project is yet another of the intangible rewards offered to volunteers who give their free time and effort to participate in a crowdsourcing project.
Finally, a great crowdsourcing project should have an exit strategy established right from its creation. To be great, the project will have a plan for what it will do when it is required to wind up the project. The project should ensure that the volunteer work product is not lost after the project has been wound up. This may entail digital archiving of the volunteer work product at the hosting institution and external repositories and saving the project website URL to the Internet Archive, etc. To be great, the project should also have a plan of what to do with the volunteers once the project has been wound up, e.g., suggesting a similar crowdsourcing project to participate in, sending a letter of thanks explaining the progress and intended use of the volunteer contributions, etc.”