Hints and tips for crowdsourcing: A handbook from PriceCheck


© 2013 Ron Cogswell, Flickr | CC-BY | via Wylio

Summary: We are often invited to share our expertise in crowdsourcing, as I did as a co-author of a recent “How-To Guide to Crowdsourcing” for the Tow Center for Digital Journalism at Columbia University. While there is, strictly speaking, no rulebook for how to do this, there is an evolving community of practice and some shared ideas. Among the great places to check out current practices in crowdsourcing is the Crowd-Powered News Network that was founded by Amanda Zamora and her colleagues at ProPublica. In the process of putting together our crowdsourcing projects, I’ve come up with a few ideas that I’m putting out here as a resource. Others are sharing theirs, too, on the CPNN group and elsewhere.

Here are a handful of things we try to think about as we’re crowdsourcing health prices.

We think of this as journalism that is imbued with humility. As Jim Schachter, vice president for news at WNYC public radio, said in an interview for the Tow Center report: “That’s what it’s about. It’s a genuine expression of humility that the audience, however you’re defining it for a particular endeavor, knows more than you do — and it’s to be listened to. That’s really important.”

Planning a project

Listen carefully. Listen to your community, and know what they care about.

What’s the problem you’re trying to solve? What question or questions are you answering? Define your goals clearly, and decide how you’ll measure success: clicks, Facebook likes, and earned media mentions can count just as much as shares from your community. Caution: if the problem you’re trying to solve is that you don’t want to pay contributors, think again.

Explain your goal and mission. Build something in which you actually join hands with people, creating something greater than the sum of its parts, rather than asking them for free labor.

Explain to your community what they’ll get. For example, in ours, it’s health care pricing info, and the ability to compare their prices with others’. In the WNYC sleep project, they got to compare their sleep patterns with others’.

Understand your contributors and their motivations.

How can you best structure the project? It’s on you to decide whom to ask, how to ask, where to ask, and what to ask. We tell our community what we are doing and how we want them to contribute, so that the results will be of the most use to them and to us.

The more effort and energy you put into the project, the better your results will be. If you make an open call and walk away, your results will be diminished. If you don’t have a conversation with your community, they’ll know you’re not engaged.

We give the community data to start out with, so they know we have put in effort and so they have something to work with. We then ask for their help in adding to our knowledge. We chose the PriceCheck methodology carefully: understand the problem, put a lot of work into it, and then ask the community to help. We also chose the wording carefully: “A community-created guide to health costs.” In other words, rather than putting a form on the site saying “send us your stuff,” we made a database and asked people to add to it.

Little things matter. Our buttons say “Share,” not “Submit.” We made the data in PriceCheck immediately visible upon search by default, though people told us that was a terrible mistake. It turned out to be fine with our public media partners’ audiences. A potential partner, a big digital media company, told us “the trolls will be out for this one”; in their environment, moderation would be not just advisable but mandatory, and visible-by-default would fail. Design decisions like these go a long way toward predisposing your project to success.

Ask questions that have answers. Free-form crowdsourcing can be fun, but a simple, answerable question — how much do you pay for your birth-control pills? — makes it easy and rewarding for people to contribute. Questions with answers let people give you stuff easily.

Reduce friction for contributing. Know a lot about what you want and how to make it easy for people to help.

Timebox if you can. Open-ended projects can fizzle: you need engagement, and that can be hard to sustain over time. Alternatively, if you’ve developed a methodology that works, apply it to a different problem.

Building together

Show the community the contributions. Our pilot asked community members to fill out a form, but they could not see the resulting data (theirs or anyone else’s) until we and our partners had decided what to do with it. We moved away from that when we built our improved PriceCheck tool, making openness and visibility the default.

Acknowledge and reward contributors in many ways. A thank-you e-mail, a shoutout on air, an invitation to an event.

Be ready for a flood of responses early on, but make sure you keep asking people to contribute, reminding them in ways large and small. If people heard about us for the first time while driving, for example, they might not go find data and share it — but if they heard us three or four times, they’d find or make the time to help us out.

Some projects build slowly. Not all efforts work the same way.

Don’t mistake your contributors for yourself. We asked our public radio community in SF and LA about IUD prices, but they were more interested in colonoscopy prices. That’s not surprising if you stop and think about public radio demographics, something we probably should have considered in the first place.

Report back to your community early and often on what the results are. At PriceCheck, our embeddable widget is available 24-7 for people to share and search, and we also write great stories with the data.

Respond to community members when they ask questions. In one iteration of PriceCheck, an early flurry of email questions from community members let us build a list of Frequently Asked Questions (FAQ) about the project within the first 24 hours. That FAQ information could then be sent as a link, billboarded on the project page, or cannibalized for response emails directly to community members.

Stay in close touch with your team. If your project needs tweaking, fix it. If it doesn’t work, fix it or end it, taking your learnings forward to the next endeavor. (And be public!) If people criticize what you’re doing, listen carefully to what they think is wrong.

How to look at your results

Work to get as much data, and as many results, as possible. Some of it will be partial, or random, or an answer to a different question from the one you asked. Be open to the possibility that you will have to analyze, follow up, and do some data work. Don’t expect every contribution to be a perfect, complete answer to the question you asked, or a flawless execution of the task you defined.

Small data can be meaningful, even more meaningful than big data. ProPublica collected 1,000 responses in its Patient Harm questionnaire; that’s not 1 million or 10 million, but if they were quality responses (and they were!) that is meaningful. What you do with the responses is also very important; here’s a thoughtful piece about why Big Data isn’t the only answer.

Remember it’s not only quantity, but also quality. Sometimes reaching a handful of people, if they’re the right people, is best — or even just one person.

Understand that people may be skeptical. “Why would you believe any of this anyway? It’s all crowdsourced.” Be ready for this objection; it comes up a lot. We usually say “we trust our community and believe in them.” And we mean it.

Be open to the possibility that the responses will lead you in a direction you didn’t expect.

Think about measuring: sometimes you think you want to measure one thing at the outset, then realize that other things are equally important.

Think carefully about verification. As you run a project, observe what the results are, and think about whether anyone is goofing on you. How can you tell? What do colleagues say? We spotted only a couple of “Astroturf” contributions over a combined 10 months of PriceCheck.

Think of engagement as a ladder, or as a range of potential actions, with sharing data being only one of the things we would like you to do: you can also read, scroll, share the item to your networks, tweet us, comment on it, search our database, email us, contribute your information (if you have it), send in your explanation of benefits and bill, or appear on a radio show. You might also invite us to come testify before your State Senate committee on price transparency. Setting out those goals at the beginning and measuring them during the project is challenging, but necessary.

We also found things came up that we didn’t realize we would want to capture: one of our posts went viral and drove a lot of traffic back to the site, and we did not expect to be asked to testify before the State Senate Health Committee. This points to the question of measuring impact and what the best tools for that are — which points, again, to the idea of specifying your goals before you start while staying open to other kinds of impact.

Little things can show you big things. When you search our database, for example, that lets us know what you’re interested in, even if you don’t share your data. So we didn’t ask that question specifically (“what are you interested in?”), but the way the project is structured, the community can tell us things that we didn’t know to ask.

Have a freeform “notes” or “comments” box to make it easy for people to talk to you. We are asking for data, but also for impressions and narrative. Making both possible was important.
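As an illustration only — this is not the actual PriceCheck schema, and every field name here is hypothetical — a contribution record along these lines pairs structured price fields (which you can analyze in aggregate) with a freeform notes field (which a reporter can read for story leads):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PriceSubmission:
    """One community contribution: structured data plus freeform narrative.

    All field names are illustrative, not PriceCheck's real schema.
    """
    procedure: str                      # e.g. "colonoscopy"
    charged_price: float                # what the provider billed, in dollars
    paid_price: Optional[float] = None  # what was actually paid, if known
    insurer: Optional[str] = None       # optional: not everyone wants to say
    notes: str = ""                     # the freeform "comments" box

def has_narrative(sub: PriceSubmission) -> bool:
    """Flag submissions whose notes may be worth a reporter's follow-up."""
    return bool(sub.notes.strip())
```

Separating the structured fields from the notes box is one way to get both things the handbook asks for: numbers you can compare across the community, and the impressions and narrative that make the stories.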

Be open! We are smarter and stronger together.

Listen carefully.

Where people doing crowdsourcing sometimes go astray

  • They expect all the information to be perfect.
  • They expect the community to be as excited as they are and to be doing nothing else.
  • They fail to listen. If you’re monitoring the responses and they’re not what you want, then fix it.
  • They don’t define the task clearly enough.
  • They want full control.
  • They don’t fully believe their communities have a lot to contribute.
  • They don’t user-test their software or their callout, even in hallway testing or cafe testing, before launching.
  • They don’t respond: They put up a form and then walk away. They don’t engage.

Why people contribute to crowdsourced projects

  • They have specialized knowledge.
  • They want to contribute to the common good.
  • They want to help newsgathering at their public radio station.
  • They want to feel like they’re part of something bigger.
  • They want recognition (but there are fewer of these than you’d think). Lots of projects have people who ask for anonymity because they specifically do not want recognition or do not want to be named.
  • They want to right a wrong.

Suggestions, questions? Let us know in the comments.

You can download our Tow Center for Digital Journalism “How-To Guide to Crowdsourcing” here.