Green Crab Monitoring

Building Citizen Science with Volunteers as Partners: Part 3

The field of citizen science has grown explosively over the last decade, benefiting from excellent PR, Presidential shout-outs, advances in big data, and, not least, a Web 2.0 world replete with hashtags, networks, and crowdsourcing. It turns out people love doing science in their spare time! Historians remind us that what we call citizen science is not a recent invention (e.g., Audubon Christmas Bird Counts date back to 1900), but the current trend certainly represents a transformation of the way science is done, and of volunteerism.

The Crab Team project enters this field during a period when the discipline is growing in self-awareness. As the President’s science advisor makes the call for citizen contributions to influence policy, we recognize we have a responsibility to invest thoughtfully in the design and execution of our program. Our approach has been motivated by two core goals:

  1. Generate high-quality data and information that can be used to inform policy and management; and
  2. Engage and educate volunteers, so that they can act as stewards and advocates for the marine environment.

The volunteer experience is of paramount importance to both of these goals. The volunteer is the lens situated squarely between the crab on the beach and the number I enter into the master spreadsheet. Quality data depend on volunteers knowing the protocols and being able to implement them comfortably, troubleshooting when necessary. Quality data also depend on volunteer consistency, which sustains the project over time and across a growing geographic range. Volunteers have to feel comfortable, useful, and valued to return month to month, year to year – no one really feels like it’s worth their time to show up and do half a job!


The Crab Team staff members headed out to the field in 2015 to develop our survey protocols. Photo: Jeff Adams

It’s with all this in mind that we have been conducting the ongoing assessment of our pilot year discussed in the previous two posts. We are striving to make this a sustainable program that builds a valuable long-term dataset on the health of pocket estuaries and salt marshes in Washington. As we prepared to launch our pilot season in 2015, we on the Crab Team staff did our level best to come up with a protocol that was effective (to maximize the chances that we would find green crabs as soon as they arrive), rigorous (to provide reliable data that allow comparisons among sites), and practical (so we don’t overwork our dedicated volunteers). We consulted our own experience and that of other experts in the field, but some of the most relevant feedback has come from our volunteers. Believe us, we muscled through some excruciating/exhilarating conversations about what quadrat size to use, and don’t get us started on how we selected mackerel as bait.

The evaluation project by capstone student Natalie White addressed both of our goals:

  1. How well do we think our volunteers are collecting data? That is, are they actually prepared to sample independently at various stages of the training process: after our initial training workshop, and after our site visit?
  2. How happy and confident do volunteers feel about different aspects of the sampling process?

We’ve raked through the results (read posts 1 and 2) and made several changes this year, some small, like the format of the data sheet, and some larger, like the way we collect information on molts.

Changes to training workshops

Results from the evaluation underscored something we had suspected: on the scale of citizen science participation, Crab Team is a relatively intensive project, and requires a non-trivial investment of time and expertise from our volunteers. Therefore, it makes sense for us to likewise invest in training. In 2015, we planned training workshops to run for 3 hours (and in a couple of cases we only had 2 or 2.5 hours to run them!). The most consistent feedback we got on our training efforts was to make them longer. This was a bit of a surprise for us, because we had assumed “shorter = better.” But it turns out that volunteers who signed up for the trainings were already expecting to invest time in learning new techniques and information.


Longer trainings mean more face time with critters.

This year, we lengthened the training workshops to 6 hours, basically a full day. This gives us a lot more opportunity to spend time on one of the most fun parts of learning: The Pocket Estuary Diversity Bonanza. Everyone is here for the critters! Sure, learning how to measure with vernier calipers IS a blast, but playing with crab molts is way more fun. And, according to our volunteers, this is the skill with which they feel the least confident. Therefore, this was one area where we felt we could add substantial value to the workshop. This year, we’ve set aside a lot more time not only to learn who the critters are (setting up interactive ID stations), but also what they do. Sean McDonald provides excellent natural history background and trivia that will undoubtedly supply cocktail party fodder with which volunteers can impress their friends! We’ll also be adding online resources over time, to enable volunteers, and anyone else, to practice and learn about the animals we see.

The longer workshops also give us the time to explain why we do what we do. We’re not into the “because we said so!” approach to science, handing down a protocol as a ***Decree From Crab Team Headquarters***. We believe that showing why it pays to count all of the crabs, and what we can do with the data volunteers collect, encourages everyone to be mindful of the details, improving both data quality and volunteer engagement, and providing some education about the science along the way.

Here is a great place to mention that we are fortunate to have an exceptionally skilled set of volunteers, many of whom have studied or practiced science as a career. Conversations about why we focus on procedural minutiae often launch discussions about benefits and drawbacks of alternative approaches, and we get to benefit from a broader set of skills and scientific perspectives. Such negotiation and iterative improvement of the methods is at the core of how science makes progress – it’s well worth the time.

Changes to our protocol

By the end of the 2015 monitoring season, we had two months of data, feedback, group observations, and surveys to help us identify which parts of our protocol might benefit from tweaking. We expected that it wasn’t necessarily going to be all roses – counting 400 crabs in a single trap is tedious, there’s no two ways about it. The mud is, well, muddy at times. We had also experienced for ourselves that some parts of the protocol would be difficult to adapt to a diversity of sites – how do we meet the goal of sampling consistently at a wide variety of sites? Are all of the data we are collecting sufficiently valuable to ask our volunteers to put the effort into collecting them?

It became apparent that one of the biggest sources of confusion, admittedly for the Crab Team staff as well as volunteers, was the way that we surveyed crab molts. Where do we put the transect rope? How carefully should volunteers look for molts? Because if you really start getting down into the pickleweed, you find a ton of molts you didn’t know were there, and you could be there for days!


Every site is different; Deer Lagoon required us to rethink how we measure the value of shoreline habitat.

The shoreline survey (including the molt survey) has been the focus of most of our protocol changes. We have separated the two components of 1) gathering information on shoreline features (how much vegetation is there, what is the composition of the wrack) and 2) looking for and counting crab molts. The first part enables us to ask questions about what type of habitat the site provides, and maybe get insight into how green crabs might or might not change the habitat if they invade. The second component, the Molt Hunt, targets early detection of green crabs, and also enables us to look at the seasonal timing of crab molting. It’s also an excellent candidate for the next event to be added to the Summer Olympics, and our volunteers will have a leg up on training. Our goals for the change were to streamline this survey and reduce unnecessary tedium, but still maintain the ability to compare sites to each other – even if they are very different. We expect there will still be a learning curve, for us as much as the volunteers, to find out how well this works on the scale of a very large monitoring project. But we hope that it will ultimately be more straightforward than our pilot protocol.

Things we are not changing

Throughout all of this, and hopefully it’s somewhat evident in the spirit of this [cough, cough: rather lengthy] post, we will maintain our commitment to communication, transparency, and good humor. We are growing dramatically this year, increasing to a target of 30 sites from 7 last year. We have tried to do our homework to pave the way for this growth, and we feel there is a compelling need to cast a wide net with the work we are doing to protect Washington shorelines.

We won’t likely be changing our protocol every year – not least because doing so would make it impossible to compare year to year, which is the whole point of this program! But we will be looking for better ways to describe and teach the protocol, and tools to make it easier for volunteers to do their work. In that spirit, we still want to hear from our volunteers. We will still be collecting feedback, informally along the way, and formally at training workshops and at the end of the season. We also invite volunteers to participate in the analysis and use of the data that come from this project. What questions do you have that we might try to answer? What patterns do you see at one site that we might test across all of our sites? We’re all ears!

We take this approach – adaptive management, dialogue, and partnership – because we believe it will make our program better than if it were just the four of us sitting around a table. This work is also part of what we can do to say “THANK YOU” to you, our volunteers, for working with us, donating your time and expertise, and supporting our project – thank you for being our partners.

-Emily Grason, Project Coordinator

March 16, 2016

Read the previous post on volunteer feedback

 
