Do You See What I See? Quantifying Inter-Observer Variability in an Intertidal Marine Citizen Science Experiment

Hannah Earp*, Siobhan Vye, Katrin Bohn, Michael Burrows, Jade Chenery, Stephanie Dickens, Charlotte Foster, Hannah Grist, Peter Lamont, Sarah Long, Zoe Morrall, Jacqueline Pocklington, Abigail Scott, Gordon Watson, Victoria West, Stuart Jenkins, Jane Delany, Heather Sugden

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Citizen science represents an effective means of collecting ecological data; however, the quality and reliability of these data are often questioned. Quality assurance procedures are therefore important for determining the validity of citizen science data and promoting confidence in the conclusions drawn from them. Here, data generated by a marine citizen science project conducted at 12 sites across the United Kingdom were used to investigate whether a simple, low-taxonomic-resolution field-monitoring protocol allowed trained citizen scientists to generate data comparable to those of professional scientists. To do this, differences between field estimates of algal percentage cover generated by different observer units (i.e., trained citizen scientists, professional scientists, and combined units) and digitally derived baseline estimates were examined. The results show that in the field, citizen scientists generated data similar to those of professional scientists, demonstrating that training, coupled with the use of a simple, low-taxonomic-resolution protocol, can allow citizen scientists to generate robust datasets in which variability likely represents ecological variation or change rather than observer variation. The results also show that, irrespective of observer unit, differences between field and digital baseline estimates of algal percentage cover were greatest in plots with medium levels of algal cover, highlighting that additional or enhanced training for all participants could be beneficial in this area. The approach presented can serve as a guide for existing and future projects with similar protocols to assess their data quality, to strengthen participant training and protocols, and ultimately to promote the incorporation of robust citizen science datasets into environmental research and management.

Original language: English
Number of pages: 12
Journal: Citizen Science: Theory and Practice
Volume: 7
Issue number: 1
DOIs
Publication status: First published - 4 May 2022

Bibliographical note

Publisher Copyright:
© 2022 Citizen Science: Theory and Practice. All rights reserved.

Keywords

  • Coral Point Count
  • data accuracy
  • data verification
  • public participation
  • temperate rocky shore
  • volunteer
