A Peer Data Review experiment to bridge a support gap, not a skills gap

Some of the seeds for Peer Data Review were planted at a training day for local journalists this May. (photo/Alyson McClaran for Colorado Media Project)

This fall we ran a pilot program that offered peer coaching for journalists with a data story in progress but no colleagues to backstop their work. A dozen people connected with community volunteers, so that’s 12 journalism projects moving that much closer to having an impact—something worth celebrating!

The Peer Data Review program was community-powered, informed by conversations over the past year with many journalists in local and regional newsrooms. We modeled it on the kind of coaching that already takes place informally in places like Slack DMs, email lists, and conference lunches. Not everyone knows help is out there, though, or feels able to ask for it. This program’s purpose was to make that network more visible and accessible, and to learn what could make a peer-driven support system sustainable.

We organized Peer Data Review as an experiment from the start, open for a month of responding to requests for help, then several weeks of follow-up and assessment. It’s so exciting to see all our hopes now showing up in program feedback: “Having an experienced person to bounce my ideas off of was so valuable—especially since I don’t have someone in my immediate workspace who knows much/more than me regarding data journalism.”

We’re counting this pilot as a huge success, with some key takeaways heading into 2020:

  • Journalists in smaller newsrooms face a support gap, not a skills gap. The roadblocks for local and regional journalists aren’t in learning the techniques of data journalism; they’re in finding someone who can double-check their analysis or point them in the right direction when they’re stuck. As one participant told us: “Both my manager and my editor for this story were happy that there was someone else I could chat with about things that they might not understand as well, or answer questions they may not be able to.”
  • Local coders are here for each other. We had no trouble putting together a group of volunteer coaches before the program started, and more people offered to help as soon as the pilot program began. Our coaching pool doubled in just a couple weeks.
  • If you create some light structure (then get out of the way), it doesn’t take much time to get a lot done. We created documentation for coaches to start from, and laid out clear guidelines so everyone would know what to expect from the program. Our project form collected key information up front, and as requests for help came in, we matched participants with coaches based on topics, background, and tools. Between emails and one-on-one calls, it took just an hour or two to make a difference in nearly every project.
  • Part of sustainability is valuing people’s time. Many of our coaches already help colleagues often, both in their day jobs and beyond. We offered a thank-you stipend to recognize the time we were asking them to commit, and to make sure people whose schedules weren’t as flexible could still participate. This was another way we could build equity into the program from the start, and participants told us that knowing people were being compensated made it easier to ask for help. “At first I felt a little guilty taking the money as I was motivated by just wanting to help, but I think it is important for us to also feel like our skills and expertise are valued by the broader community.” We agree!
  • Knowing that coaching is available can give journalists more confidence to pursue data stories in the first place. We’re hearing this directly from people who participated in the pilot, and that’s one of the most exciting takeaways of all. As one participant put it: “It’d be great to have something like this available on an ongoing basis. Without it, I’d never have felt confident enough to look to somebody … for help as a solo data journo.”

In the next month or two, we’ll be looking to share some of the stories that were published after peer coaching. This network of news nerds shows up for each other, and we hoped that outlining a simple process and helping people connect would make that support easier to offer and easier to find. It’s exciting to see such tangible results right away.

(And if you’re curious about how we ran this experiment, read on! We’re all about working in the open and learning from what we do, so here’s how we put the program together. I hope you like bullet points, because we have bullet points.)

Our assumptions

To build something worth testing, we had to make some assumptions about how a Peer Data Review program should operate. Our experience organizing community-support programs gave us a place to start, but we also looked at other peer-review programs with similar goals, checked in with people who lead journalism trainings, and talked with journalists who’ve asked friends in other newsrooms to look over their work. We ended up with a few principles to drive our decision-making:

  • Our coaching model should take about two hours total. We figured most projects would require trading some emails, looking over data or code, then scheduling a call to talk things through. But the process should be flexible at the discretion of each coach. Some projects might need an hour of help, others might need three or four. Maybe emails would be enough, or maybe a project would need a kickoff call and a follow-up later.
  • Stipends should be baked in from the start. We could have just asked people to volunteer as coaches, and I’m confident we’d have gotten a great response. But part of figuring out sustainability is making sure people’s time is valued.
  • We needed a wide pool of coaches before we started. When someone asked for help, we wanted to match them with a person who was familiar with the tools they were already using. And working with someone from a similar—or a very different!—background can be a really helpful experience. We decided 8–10 coaching volunteers would be a good goal.
  • Participants needed to know their work would remain their work. No one wants to get scooped, and sharing any kind of credit should be completely optional. We also thought a lot about details that would be helpful to share with editors if reporters felt like they needed to ask permission to participate.
  • Because there were things we couldn’t know, clarity wherever possible was crucial. We were careful to communicate that this program was an experiment, but coaches needed to know what they were actually committing to, and that we had a plan to support them. Participants needed to know what kind of projects to bring to the program, what they could expect from a coach, and that we had a plan to protect their privacy.

How we ran the program

Before we could launch the pilot program, we needed coaches. And before we could invite coaches, we needed documentation to share with them.

  • We drafted text for a public-facing program page to help potential coaches understand how we hoped data review would work (and, more importantly, how we’d set expectations for participants) and to invite their feedback on the assumptions we’d made.
  • We wrote an internal coaching guide, with more detail about our goals for the pilot program, specifics on timing and stipends, and suggested patterns for coaching.

Those documents gave us a complete picture to share as we invited people to join the pilot as coaches, with room for them to respond with ideas and adjustments. In a little more than a week, we had nine volunteers, enough to feel comfortable launching publicly. And their observations about our program documentation helped us keep clarifying expectations and identifying the right questions to ask participants.

Our project form would be the first part of any conversation between peer coaches and participants, so it needed to be short but detailed. We asked people to tell us about their data and what they hoped to do with it, what tools and languages they were most comfortable in, and any specific questions they already had.

And we developed an outreach plan to make sure that people who could really use Peer Data Review actually found out about it at the moment they needed it. We had so much help spreading the word through tweets, posts, and newsletters from organizations like LION, INN, API, Nieman Lab, Poynter, and Local News Lab. We shared details in journalism Slacks and email lists. (And I still feel like there are lots of journalists who’d love to have a peer look at their work but never heard about this program. We also heard from journalists who were excited about peer review but didn’t happen to be working on a data story while it was open.)

With our program page published and the outreach plan in motion, requests for help began to come in. As they did:

  • We triaged each request ourselves. Along with recruiting the initial group of coaches, this was the highest-touch part of the process on our end. Based on project topic and tools, we reached out individually to a coach or two who felt like a good fit. We recognized that these were requests for people’s time, alongside their own work demands and personal lives. If no one was available, we sent out a wider email asking who might have time and interest in the project. Most projects took just a day or two to find a coach.
  • We created a “starter kit” with additional details for each project. Each project had a Google Doc shared just between us and the coach, including everything the participant told us about their project, plus a space for notes.
  • Coaches managed all the communication. We sent participants a very short introduction email, letting them know who they’d be hearing from and pointing back to the set of expectations they were agreeing to by participating. (Not a group email, by the way, to avoid any need for the “bumping you to BCC” dance.) From there, the coach took over and organized the conversation however worked best.

In the month the form was open, we connected 12 projects with coaches. A small number of requests for help weren’t quite right for this program, and we tried to explain why and then connect those people with different resources.

After the pilot program closed, we got back in touch with coaches first. In a feedback form we asked a few questions about how things went, whether they felt like their time was valued, and what kind of questions we should ask participants about that side of the experience. We waited a few weeks to ask for participant feedback to give people time to act on the advice they got through the program (and maybe even publish what they were working on). Some of those comments are still coming in, and we’ve already learned SO MUCH from testing the assumptions we came into the program with.

posted December 17, 2019