Context Matters: Accounting for Item Features in the Assessment of Citizen Scientists' Scientific Reasoning Skills

Authors
Till Bruckermann, Tanja Straka, Milena Stillfried, Moritz Krell
Abstract

Citizen science (CS) projects engage citizens for research purposes and promote individual learning outcomes such as scientific reasoning (SR) skills. SR refers to participants' ability to solve problems scientifically. However, the evaluation of CS projects' effects on learning outcomes has suffered from a lack of assessment instruments and resources. Assessments of SR have most often been validated in the context of formal education; their items are not contextualized to be authentic or to represent the wide variety of disciplines and contexts in CS research. Here, we describe the development of an assessment instrument that can be flexibly adapted to different CS research contexts. Furthermore, we show that this instrument, the SR questionnaire, supports valid conclusions about participants' SR skills. We found that the deep-structure and surface features of the items in the SR questionnaire represent the thinking processes associated with SR to a substantial extent. We suggest that practitioners and researchers consider these item features in future adaptations of the SR questionnaire. Doing so will most likely enable them to draw valid conclusions about, and gain a deeper understanding of, participants' SR skills in CS project evaluation.

Organisation(s)
Institute of Education
External Organisation(s)
Technische Universität Berlin
Leibniz Institute for Zoo and Wildlife Research (IZW)
IPN - Leibniz Institute for Science and Mathematics Education at Kiel University
Type
Article
Journal
Citizen Science: Theory and Practice
Volume
6
No. of pages
15
ISSN
2057-4991
Publication date
25.11.2021
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
General
Electronic version(s)
https://doi.org/10.5334/cstp.309 (Access: Open)