
Exam readings: Public participation in science

Well, here they are: my last three readings for my public understanding of science reading list. After this, I’ll be spending the next week thinking ONLY about my first exam, which is coming up… And I will be presenting a paper at a conference this weekend- but more on that anon.

Anyway, here are the last three readings. These are all gray literature, but they give a current overview of at least NSF’s thinking about the field of PUoS:

First: Friedman, Alan J., Sue Allen, Patricia B. Campbell, Lynn D. Dierking, Barbara N. Flagg, Cecilia Garibay, Randi Korn, Gary Silverstein, and David A. Ucko. “Framework for Evaluating Impacts of Informal Science Education Projects.” Washington, D.C.: National Science Foundation, 2008.

Summary: Report from a Natl. Science Foundation workshop on informal science education (ISE) in STEM fields; provides a framework for summative evaluation of projects that will facilitate cross-comparison. The authors identify six broad categories of impact: awareness, knowledge, and understanding; engagement or interest; attitude; behavior; skills; and “other” (project-specific impacts.) For funding purposes, proposals must outline their goals in these categories- while this won’t fully capture learning outcomes, it provides baseline information for evaluating the field of ISE. Also provides advice and suggestions, e.g., what to think about when coming up with goals, what approaches to take, how to evaluate, and how to document unexpected outcomes. It also discusses evaluation designs: NSF’s preference is for randomized experiments, but the general advice is to use the most rigorous methods available (e.g., ethnography, focus groups)- discusses pros and cons of various methods. Some specific considerations for ISE evaluation include the different starting knowledge of participants; assessments should be inclusive of those from different backgrounds (drawing pictures, narratives, etc.) Also discusses specific methods, potential problems, and how to assess impact categories for various types of projects (e.g., exhibits, educational software, community programs.)

Comments: The report is targeted at researchers being funded by NSF, to help them navigate new reporting requirements for projects with a public education component. Not useful for my purposes as theoretical background, but it does give an outline of NSF’s current thinking on this field.

Links to: Bonney et al. (use this framework for their report); Shamos (discusses different types of evaluation of scientific literacy)

Second: McCallie, Ellen, Larry Bell, Tiffany Lohwater, John H. Falk, Jane L. Lehr, Bruce V. Lewenstein, Cynthia Needham, and Ben Wiehe. “Many Experts, Many Audiences: Public Engagement with Science and Informal Science Education.” Washington, D.C.: Center for Advancement of Informal Science Education, 2009.

Summary: Study group report on public engagement with science (PES) in the context of informal science education- the focus is on describing/defining this approach. PES projects by definition should incorporate mutual discussion/learning among the public and experts, facilitate empowerment/new civic skills, increase awareness of science/society interactions, and recognize multiple perspectives or domains of knowledge. This approach is most common in areas of new science or controversy; the authors mention that the idea is not to water down the science, but to bring social context into the discussion. There are two general forms of PES in informal science education (ISE) projects: “mechanisms” (mutual learning is part of the experience- blogs, discussions) and “perspectives” (no direct interaction, but recognition of multiple values- e.g., incorporating multiple perspectives into an exhibit.) They contrast this approach with two views of traditional PUoS (making knowledge more accessible/engaging): the first view (generally held by ISE practitioners) sees PUoS as a public service; the second view (generally an academic STS/science communication perspective) sees PUoS as non-empowering, based on a deficit model, and failing to recognize that the public can be critical consumers or even producers of science. PES arises from this second view: the key is that organizations must think critically about how publics and experts are positioned in interactions, and bring in “mutual learning.”

Comments: While the authors recognize that “engagement” has multiple meanings (action/behavior, learning style, overall learning, participation within a group), the PES approach is not about directly influencing public policy or the direction of research. Presumably that approach is too activist(?)- they do mention the need to work toward using PES to affect policy/research. This report seems to take as a given that mutual dialogue between the public and experts is a good thing; I’m not sure how well it would make that case to organizations that are skeptical of that approach.

Links to: Trench, “Analytical Framework” (assessment of the place of the “engagement” model)

Third: Bonney, Rick, Heidi Ballard, Rebecca Jordan, Ellen McCallie, Tina Phillips, Jennifer Shirk, and Candie C. Wilderman. “Public Participation in Scientific Research: Defining the Field and Assessing Its Potential for Informal Science Education.” Washington, D.C.: Center for Advancement of Informal Science Education, 2009.

Summary: Study report on public participation in scientific research (PPSR) as part of informal science education (ISE.) History of ISE: it began as public understanding of science (PUoS)- experts determined what the public should know, and explanations should lead to greater knowledge, which should lead to greater appreciation. Shortcomings of PUoS are that people have greater engagement when a topic is directly relevant or interactive, and that the focus is on content delivery rather than on understanding scientific processes. PPSR projects (citizen science, volunteer monitoring, etc.) ideally lead to learning both content and process. These projects involve the public in the various stages of the scientific process to some degree. Three types: contributory (scientists design, public just gathers data), collaborative (scientists design, public helps refine, analyze, communicate), and co-created (designed by both, with at least some public participants involved in all steps.) They evaluated 10 existing projects using Friedman et al.’s rubric; there is potential in PPSR projects to address all categories of impact. Future opportunities include developing new projects (new questions, engage new audiences, test new approaches), enhancing current PPSR projects (e.g., going from contributory to collaborative or co-created), adding PPSR elements to other types of ISE projects, and enhancing research/evaluation of PPSR projects. Two final recommendations are that projects should do a better job of articulating learning goals/outcomes at the beginning, and that comprehensive evaluation methods should be developed.

Comments: This committee report offers a current assessment of PPSR projects and synthesizes recommendations for future research. Scientific literacy remains a basic individual measure in this framework, even with the emphasis on participatory interaction (in contrast to a social constructivist approach.) While the assumption is that PPSR projects do affect understanding of science, there are large challenges to assessing this, even at the individual level; part of the problem is that this type of assessment is often added post hoc.

Links to: Roth & Lee (conceptualize sci. literacy in PPSR as a communal property, not individual); Friedman et al. (framework for evaluating PPSR projects)
