Printed from https://ideas.repec.org/a/gam/jdataj/v2y2017i4p35-d116113.html

Earth Observation for Citizen Science Validation, or Citizen Science for Earth Observation Validation? The Role of Quality Assurance of Volunteered Observations

Authors
  • Didier G. Leibovici

    (Nottingham Geospatial Science, University of Nottingham, Nottingham NG7 2TU, UK)

  • Jamie Williams

    (Environment Systems Ltd., Aberystwyth SY23 3AH, UK)

  • Julian F. Rosser

    (Nottingham Geospatial Science, University of Nottingham, Nottingham NG7 2TU, UK)

  • Crona Hodges

    (Earth Observation Group, Aberystwyth University Penglais, Aberystwyth SY23 3JG, UK)

  • Colin Chapman

    (Welsh Government, Aberystwyth SY23 3UR, UK)

  • Chris Higgins

    (EDINA, University of Edinburgh, Edinburgh EH3 9DR, UK)

  • Mike J. Jackson

    (Nottingham Geospatial Science, University of Nottingham, Nottingham NG7 2TU, UK)

Abstract

Environmental policy involving citizen science (CS) is of growing interest. To support this open stream of data, the validation, or quality assessment, of geo-located CS data for its intended use in evidence-based policy making requires a flexible, easily adaptable, and transparent data curation process. Addressing these needs, this paper describes an approach to automatic quality assurance proposed by the Citizen OBservatory WEB (COBWEB) FP7 project. The approach is based on workflow compositions that combine different quality controls, each belonging to one of seven categories, or “pillars”. Each pillar focuses on a specific dimension of the reasoning algorithms used to qualify CS data, and attributes values to a range of quality elements belonging to three complementary quality models. Additional data from various sources, such as Earth Observation (EO) data, are often included among the inputs of the quality controls within the pillars. However, qualified CS data can also contribute to the validation of EO data, so the two validation questions can be considered “two sides of the same coin”. Based on a CS study of the invasive species Fallopia japonica (Japanese knotweed), the paper discusses the flexibility and usefulness of qualifying CS data, either when using an EO data product for validation within the quality assurance process, or when validating an EO data product that describes the risk of occurrence of the plant. Both validation paths are found to be improved by quality assurance of the CS data. Finally, addressing the reliability of open CS data, the paper describes issues and limitations of the role of quality assurance in validation that arise from the quality of the secondary data used within the automatic workflow, e.g., error propagation, paving the way for improvements to the approach.

Suggested Citation

  • Didier G. Leibovici & Jamie Williams & Julian F. Rosser & Crona Hodges & Colin Chapman & Chris Higgins & Mike J. Jackson, 2017. "Earth Observation for Citizen Science Validation, or Citizen Science for Earth Observation Validation? The Role of Quality Assurance of Volunteered Observations," Data, MDPI, vol. 2(4), pages 1-20, October.
  • Handle: RePEc:gam:jdataj:v:2:y:2017:i:4:p:35-:d:116113

    Download full text from publisher

    File URL: https://www.mdpi.com/2306-5729/2/4/35/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2306-5729/2/4/35/
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Julian Koch & Simon Stisen, 2017. "Citizen science: A new perspective to advance spatial pattern evaluation in hydrology," PLOS ONE, Public Library of Science, vol. 12(5), pages 1-20, May.
    2. Carolin Haeussler & Henry Sauermann, 2016. "The Division of Labor in Teams: A Conceptual Framework and Application to Collaborations in Science," NBER Working Papers 22241, National Bureau of Economic Research, Inc.
    3. Benedikt Fecher & Sascha Friesike & Marcel Hebing, 2014. "What Drives Academic Data Sharing?," SOEPpapers on Multidisciplinary Panel Data Research 655, DIW Berlin, The German Socio-Economic Panel (SOEP).
    4. Héloïse Berkowitz, 2020. "Participatory Governance for the Development of the Blue Bioeconomy in the Mediterranean Region," Working Papers hal-02555685, HAL.
    5. Koehler, Maximilian & Sauermann, Henry, 2024. "Algorithmic management in scientific research," Research Policy, Elsevier, vol. 53(4).
    6. Qin Ye & Xiaolei Xu, 2021. "Determining factors of cities’ centrality in the interregional innovation networks of China’s biomedical industry," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 2801-2819, April.
    7. Haeussler, Carolin & Sauermann, Henry, 2013. "Credit where credit is due? The impact of project contributions and social factors on authorship and inventorship," Research Policy, Elsevier, vol. 42(3), pages 688-703.
    8. Kim, Hongbum & Shin, Dong-Hee & Lee, Daeho, 2015. "A socio-technical analysis of software policy in Korea: Towards a central role for building ICT ecosystems," Telecommunications Policy, Elsevier, vol. 39(11), pages 944-956.
    9. Mindel, Vitali & Overstreet, Robert E. & Sternberg, Henrik & Mathiassen, Lars & Phillips, Nelson, 2024. "Digital activism to achieve meaningful institutional change: A bricolage of crowdsourcing, social media, and data analytics," Research Policy, Elsevier, vol. 53(3).
    10. Benedikt Fecher & Gert G. Wagner, 2016. "Open Access, Innovation, and Research Infrastructure," Publications, MDPI, vol. 4(2), pages 1-8, June.
    11. Antonello Cammarano & Vincenzo Varriale & Francesca Michelino & Mauro Caputo, 2022. "Open and Crowd-Based Platforms: Impact on Organizational and Market Performance," Sustainability, MDPI, vol. 14(4), pages 1-26, February.
    12. Abhishek Nagaraj, 2018. "Does Copyright Affect Reuse? Evidence from Google Books and Wikipedia," Management Science, INFORMS, vol. 64(7), pages 3091-3107, July.
    13. Christian Matt & Christian Hoerndlein & Thomas Hess, 2017. "Let the crowd be my peers? How researchers assess the prospects of social peer review," Electronic Markets, Springer;IIM University of St. Gallen, vol. 27(2), pages 111-124, May.
    14. Järvi, Kati & Almpanopoulou, Argyro & Ritala, Paavo, 2018. "Organization of knowledge ecosystems: Prefigurative and partial forms," Research Policy, Elsevier, vol. 47(8), pages 1523-1537.
    15. Ethan Mollick & Ramana Nanda, 2016. "Wisdom or Madness? Comparing Crowds with Expert Evaluation in Funding the Arts," Management Science, INFORMS, vol. 62(6), pages 1533-1553, June.
    16. Jorge Faleiro & Edward Tsang, 2018. "Black Magic Investigation Made Simple: Monte Carlo Simulations and Historical Back Testing of Momentum Cross-Over Strategies Using FRACTI Patterns," Papers 1808.07949, arXiv.org.
    17. Huang, Jiekun, 2018. "The customer knows best: The investment value of consumer opinions," Journal of Financial Economics, Elsevier, vol. 128(1), pages 164-182.
    18. Kamilla Kohn Rådberg & Hans Löfsten, 2023. "Developing a knowledge ecosystem for large-scale research infrastructure," The Journal of Technology Transfer, Springer, vol. 48(1), pages 441-467, February.
    19. Susanne Beck & Maral Mahdad & Karin Beukel & Marion Poetz, 2019. "The Value of Scientific Knowledge Dissemination for Scientists—A Value Capture Perspective," Publications, MDPI, vol. 7(3), pages 1-23, July.
    20. Perkmann, Markus & Salandra, Rossella & Tartari, Valentina & McKelvey, Maureen & Hughes, Alan, 2021. "Academic engagement: A review of the literature 2011-2019," Research Policy, Elsevier, vol. 50(1).

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.