Categories
evolution meetings pedagogy science communication science studies

Impressions from NARST

Earlier this week, I attended the conference of the National Association for Research in Science Teaching (NARST). I wasn’t presenting anything (missed the submission deadline), but it turned out to be fairly worthwhile. I ended up attending only two days of the conference, focusing primarily on the digital tools/informal science sessions. I did get the chance to chat with a few people about my work and make some connections, which is always nice.

Here are just a few impressions from the conference.

  • The digital media tools used for science education seemed to fall mainly into two categories: simulations for teaching science concepts, and simulations for assessment purposes. (This is probably not a very profound observation.) The former seem to be the more ‘traditional’ tools, e.g., using racing games or pinball-esque scenarios to teach about physics. The latter are newer to me, at least, and are significant in that they represent an attempt to get away from multiple-choice tests for assessing inquiry. There were some neat ideas in both of these general categories.
  • The digital tools for informal learning were more wide-ranging, which you’d expect. There were some cool demos here; two I found interesting were FoldIt (which turns protein-folding problems into crowdsourced puzzle games) and Dancing the Earth (which uses a mixed-reality simulation to teach astronomy concepts).
  • The session that was probably most immediately useful for me was one on problems in teaching evolution. Some of the bigger conceptual issues raised here were: the challenge of linking evolutionary processes at different scales (e.g., population dynamics & speciation), the difficulty of teaching students to differentiate between useful and non-useful types of evidence, and problems with reading phylogenetic trees.
  • I also went to a session on philosophy of science, objectivity, and teaching about pseudoscience. Some of the ideas from this session would be useful if I ever did teach science again, since it was geared more toward educators. One presentation in particular stands out, on the subject of teaching science in communities that place a high degree of emphasis on traditional ecological knowledge (TEK). The presenter tried to lay out a strategy that charts a middle course between immediate rejection and fuzzy acceptance of TEK by focusing on talking about cultural technologies, rather than immediately comparing philosophies. The idea seems to be to focus on areas where there’s common ground (i.e., observation, testing, and building technologies in both traditional cultures and science), rather than immediately alienating students by dismissing their culture or by dismissing science as merely one specialized way of understanding the world. This is an interesting idea to think about.
  • Finally, trying to present via Skype is just asking for trouble. I attended one session (a digital media session, naturally) in which two presenters were going to present via Skype. Even though everything was clearly set up and working during the break before the session, when it came time to present, something went wrong with the sound on someone’s end. The two presenters ended up being able to give their talks, after much technical tweaking, but this did not go smoothly.
Categories
exam readings pedagogy tech design

Exam readings: computer-based learning

Today’s two readings weren’t exactly what I expected, being more focused on general ideas about how to use electronic technologies in the classroom than on concrete recommendations for augmenting learning electronically. That said, both articles are relatively dated (apparently, anything from the grunge era is now dated, and that includes journal articles), so I probably should have expected them to be less than up-to-date…

Roy D. Pea. “Augmenting the Discourse of Learning With Computer-Based Learning Environments.” in Erik De Corte, Marcia C. Linn, Heinz Mandl, and Lieven Verschaffel (eds.) Computer-Based Learning Environments and Problem Solving, pp. 313–343. New York: Springer-Verlag, 1992.

Summary: Pea’s focus in this paper is on using electronic technologies to augment “learning conversations” that help students become participants in communities of practice. Basically, he is interested in the social aspects of cognition. In the communities-of-practice view, learning is integral to becoming a member of a community and maintaining membership; participation in the community, rather than information transfer, is what facilitates learning. Conversation is a key part of this process; learning conversations involve the creation of communication and the interpretation of meanings, i.e., constructing common ground among participants. In science, students need to be able to “talk science,” rather than just listening to lectures and reading textbooks. Learning language and other symbols (e.g., diagrams) and being able to “converse” with them is a large part of enculturation; this involves discussing how different representations relate to one another and to the physical world. Enculturation/increasing participation occurs via appropriation and interpretation of language and symbols. Computer tools for learning can’t teach discourse directly, but they can provide tools for developing skill in working with representations, as well as augment learning conversations. Pea discusses a case study of developing a system for creating interactive optics diagrams. One key suggestion is that such tools should create affordances that facilitate the production of visualizations, allow interpretation, and support “sense-making” (causal) narratives.

Comments: The background might be useful, but the majority of the paper is centered on classroom applications, with fewer concrete recommendations for the design of such environments than I’d thought there would be (one case study).

Links to: Lave & Wenger (comm. of practice); Roth & McGinn (also focus on science learning via representations); Scardamalia & Bereiter (using computers more for communication in education)

Marlene Scardamalia and Carl Bereiter. “Computer Support for Knowledge-Building Communities.” Journal of the Learning Sciences 3(3): 265–283, 1994.

Summary: The authors want to restructure schools as collective knowledge-building communities (KBCs). In these computer-supported intentional learning environments (CSILEs), patterns of discourse would mimic those of KBCs in the real world. Their ideas come from metacognitive learning, expertise-building via progressive problem-solving, and KBCs (here, schools would provide social support and a collective knowledge pool). They discuss ways schools inhibit this type of learning (e.g., individual focus, formal/demonstrable knowledge, lack of support for progressive problem-solving). The idea is to reframe school discourse around the collaborative processes of research facilities (e.g., journal articles represent advances in knowledge, peer review is a way to validate them). While educational technology generally supports individualized learning & drill/test, they propose a different focus. CSILEs would include: a central database in which students post “new knowledge”; features that let students comment on and build on the contributions of others (structuring communication around problems and the group’s knowledge building); explicit discussions of metacognition; small-group discussions; and tools to support different media and students who contribute different dimensions of knowledge. The general idea is that information access alone is not sufficient; you need both computer tools to explicitly build these communities and teacher strategies for promoting participation.
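
To make that CSILE feature list a bit more concrete for myself, here’s a minimal sketch (mine, in Python; the class and method names are hypothetical, not anything from the paper) of what a communal knowledge database with “build-on” contributions might look like:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a communal knowledge base in the spirit of the CSILE
# features described above: students post notes to a shared database and
# build on each other's contributions. All names here are my own invention.

@dataclass
class Note:
    author: str
    problem: str                                          # discourse organized around problems
    text: str
    builds_on: List[int] = field(default_factory=list)    # ids of notes this one extends

@dataclass
class KnowledgeBase:
    notes: List[Note] = field(default_factory=list)

    def contribute(self, note: Note) -> int:
        """Post a note to the communal database and return its id."""
        self.notes.append(note)
        return len(self.notes) - 1

    def thread(self, note_id: int) -> List[Note]:
        """Trace the chain of contributions a note builds on (group knowledge, not individual work)."""
        note = self.notes[note_id]
        chain: List[Note] = [note]
        for parent_id in note.builds_on:
            chain = self.thread(parent_id) + chain
        return chain

# Two students building group knowledge around one problem:
kb = KnowledgeBase()
first = kb.contribute(Note("A", "Why do leaves change color?", "Chlorophyll breaks down in fall."))
second = kb.contribute(Note("B", "Why do leaves change color?",
                            "Building on A: other pigments were there all along.",
                            builds_on=[first]))
```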

Comments: Less helpful for concrete ideas than I’d assumed; perhaps because it’s dated? The focus is on schools, rather than informal learning. I might be able to extrapolate from these ideas, but I’m not sure how this concept would translate to an informal setting.

Links to: Lave & Wenger?

Categories
exam readings learning theory pedagogy tech design

Exam readings: mental models and wireless devices

Not really related to each other, but these two readings are on mental models (pretty theoretical) and things to think about when incorporating wireless devices into the classroom (more practical):

Nancy J. Nersessian. “Mental Models in Conceptual Change.” in Stella Vosniadou (ed.) International Handbook of Research on Conceptual Change, pp. 391-416. New York: Routledge, 2008.

Summary: Nersessian’s main aim is to outline a framework of how mental models work and to use that framework to support conceptual change (Kuhnian “paradigm shifts” in science & for science learners); she spends most of her time on the former. One mechanism for change is building new mental models & conceptual structures. Aspects of the mental-model framework are debated; one constant is that mental representations are organized into some sort of units with a relational structure. The approach assumes that: “internal” & “external” are valid categories; internal symbolic structure is iconic (perceptual, with properties based on those of objects in the external world) rather than rule-based or linguistic; and skill in modeling is partially biological, partially from learning in social/natural contexts. She discusses four different strands in the research: “discourse models” derived primarily from language/instruction (we mentally manipulate these ideas via models, not words); spatial simulation (which seems to be perceptually based, but not entirely visual); “mental animation” (more advanced; requires causal/behavioral knowledge); and internal-external coupling (we should define external representations as part of our extended cognitive capacities). The idea of embodied representation is that perceptual experience is fundamentally tied to mental modeling processes. The entire system has both modal and amodal aspects; some concepts & processes are grounded in context, others aren’t. For conceptual change, you need to explicitly run people through model-changing activities; abstract activities can provide support in the form of mental inventories of affordances and constraints in different domains.

Comments: Specific tools that participate in coupled internal-external representational systems are “cognitive artifacts.” These include writing & diagrams, which function as external and social memory supports. This ties together the cognitive-model approach with theories of social and distributed cognition.

Links to: Rapp & Kurby (perceptual vs. amodal models of cognition); Lave & Wenger (social cognition); Zhang & Norman (internal-external coupling)

Jeremy Roschelle, Charles Patton, and Roy D. Pea. “To Unlock the Learning Value of Wireless Mobile Devices, Understand Coupling.” Proceedings of the IEEE International Workshop on Wireless and Mobile Technologies in Education, 2002.

Summary: The authors feel that handheld computers (wireless internet learning devices, or WILDs) could become ubiquitous in classrooms, but conceptual issues need to be resolved before using them on a large scale. The issue they focus on is “coupling” between social & informatic worlds with different expectations. The challenges are political, organizational, and pedagogical: e.g., how (and by whom) messaging tech should be controlled, how roles in a shared info space should be regulated, how learning resources should be stored & accessed, who decides on privacy levels, and how integrated or segregated students’ learning environments should be. The big issue is who will make these decisions; the authors suggest that these things need to be worked out, or we risk rejection of these tools by students, teachers, or others. They focus on three main design problems. 1) Curricular activity spaces vs. personal learning connections: students perceive the devices as comm. tools, while teachers want to use them to augment classroom activities (students may need separate devices for class). 2) Integrated vs. synchronized educational databases: what info will be centralized & who will have access to it. 3) Broad vs. narrow technological mediation of discourse: face-to-face interaction is still important; they may want to take a minimal-mediation route.

Comments: The authors outline some critical issues to take into account before using these devices in a classroom setting. Some of these things to think about would apply to informal settings as well, e.g., how much mediation there should be and what info (if any) is being stored by the system. Probably only tangentially related to my project.

Links to: Sharples et al. (these concerns relate to AT framework for understanding learning tool use); Borgmann et al. (cyberlearning)

Categories
exam readings networks pedagogy tech design

Exam readings: networked tech and STEM learning

Two related readings on networked technologies and science: the first an NSF task force report on cyberlearning, and the second on “collaboratories,” or collaborative laboratories.

Christine L. Borgman, Hal Abelson, Lee Dirks, Roberta Johnson, Kenneth R. Koedinger, Marcia C. Linn, Clifford A. Lynch, Diana G. Oblinger, Roy D. Pea, Katie Salen, Marshall S. Smith, and Alex Szalay. “Fostering Learning in the Networked World: The Cyberlearning Opportunity and Challenge.” Washington, DC: National Science Foundation, 2008.

Summary: This is a task force report designed to give NSF guidance on cyberlearning: “networked computing and communication technology to support learning.” Their focus is on using CL to support STEM education in a lifelong, customized setting, redistributing learning over space & time. The authors believe the potential is high now because of new technologies, increased understanding of learning processes, and demand for solutions to educational problems. Some examples: Web tech breaking down location barriers, open & multimedia educational resources, new techs. making learning affordable & accessible, cloud computing, customizable content, and an enthusiastic audience (though schools aren’t up to speed on digital techs). Key potential problems: responsible data use/data overload, scaling technologies for large communities, and how to apply software & other resources. Several issues require action: data management, open/accessible resources need to be guaranteed, and NSF needs a strategy of funding projects that produce resources for both education & research. They have five main recommendations, including a “platform perspective” (shared & interoperable designs) and a requirement that developed resources be open & freely shared.

Comments: Apparently, while the public doesn’t respect education, we do like electronic gadgets, so the idea is to use these to educate people. This reference will mainly be useful for giving me a sense of the state of the field. They do ask one question that’s interesting: should we train people to work in interdisciplinary teams, or increase the versatility of individuals? (The trend seems to be to work in teams…)

Links to: Finholt (collaboratories)

Thomas Finholt. “Collaboratories.” Annual Review of Information Science and Technology 36(1): 73–107, 2002.

Summary: “Collaboratories” are collaborative laboratories or “labs without walls.” Joint science work has historically depended on physical proximity, esp. science with large instruments (or specific study sites). While one answer has been residencies, the problems of this structure have remained, primarily barriers to access or research. Science has been moving toward large, complex distributed projects; these can be considered a type of distributed intelligence. Collaboratories require two types of IT: increased communication plus better access to instruments and data (data sharing/data viz. tools, remote-use instruments). Finholt discusses the history of such projects, from the “memex” concept & ARPAnet to current projects in various disciplines. These still involve a small number of participants; libraries and datasets get more use. Other lessons: people can use collaboratories sporadically and still find them useful, easily integrated software is more readily accepted (e.g., web-based tools), some types of activity are naturally better suited to collaboratories (data collection vs. idea generation), and there are new expectations for participants. Challenges: moving from shared space to virtual space introduces new demands; implicit interactions must be made explicit (e.g., pointing, gaze detection), and willingness to collaborate and adopt tools is also an issue.

Comments: Finholt points out that increased communication can lead to Balkanization as well as broader communication (social exclusivity); benefits are highest for students & non-elite scientists, though a drawback might be these projects becoming pools for marginalized scientists (e.g., e-journals have lower status). F2F interactions are still crucial for establishing contacts, and meetings are still important; these projects will augment, rather than replace, current practice.

Links to: Howe (crowdsourcing)

Categories
exam readings learning theory pedagogy

Exam readings: Activity theory

Activity theory seems to be popular in the educational community. I’ll be reading a few articles that involve it, but I’m still not sure how/if it will fit in with my overall project goals, as it’s used more in formal pedagogical design than for informal learning. Here are two readings that involve it:

Wolff-Michael Roth. “Activity Theory and Education: An Introduction.” Mind, Culture, and Activity 11(1): 1-8, 2004.

Summary: This is an introduction to a special issue; it focuses on several key points about AT. Interest in AT has been increasing in educational circles; the core idea is that individuals have the power to transform their communities through their activities (Marxist basis). First, the triangle model (subject, object, and community, mediated by tools/means, division of labor, and rules) is dynamic, not static (see below for the model). The subject & object are in a dialectical relationship; a contradiction between the subject’s mental image and the physical object drives action (e.g., a sculptor will keep sculpting until the sculpture matches her mental image). There’s also overall change: any human activity results in change in all elements of the system (e.g., learning through participation also means that participation has effects on the wider group). Second, individuals produce outcomes, but participation also produces the structure of the community (and the individual’s overall position as a member of the community); production drives the historical trajectory of the system. Third, internal contradictions drive activity within the system, the main one being the tension between individual production and societal production (e.g., crime reflects a fundamental contradiction between societal constraints and the actions that seem best for the individual). There are four types of contradictions: within each system component, between components, between the objects of different activity systems, and between the components of different activity systems.

Comments: Roth gives some examples of contradictions that are present in educational settings, but it would have been nice if these examples were explicitly matched up to the four types of contradictions. He mentions directions for future research (e.g., what is the nature of change in activity systems) and also notes that the dialectical approach might fit poorly with Western dualistic systems. This framework is applicable to HCI, but I have to put more thought into how it might fit with other stuff.

Links to: Suchman, Sharples et al. (AT examples)

Activity system model from http://www.quasar.ualberta.ca/edpy597mappin/modules/module15.html
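
I can’t really reproduce the diagram here, so here’s a bare-bones sketch (mine, in Python, purely illustrative; none of the names come from Roth’s article) of the triangle components and the four contradiction types described in the summary above:

```python
from dataclasses import dataclass
from enum import Enum, auto

# My own bare-bones rendering of the activity-system triangle from the Roth
# summary: a subject acts on an object within a community, mediated by
# tools/means, rules, and a division of labor. Purely illustrative.

@dataclass
class ActivitySystem:
    subject: str              # e.g., the learner
    object: str               # e.g., the task or goal being transformed
    community: str
    tools: list               # physical tools and signs/means
    rules: list               # norms governing participation
    division_of_labor: list   # who does what

class ContradictionType(Enum):
    WITHIN_COMPONENT = auto()               # e.g., a tool that both supports and hinders
    BETWEEN_COMPONENTS = auto()             # e.g., rules vs. division of labor
    BETWEEN_OBJECTS_OF_SYSTEMS = auto()     # objects of two different activity systems
    BETWEEN_COMPONENTS_OF_SYSTEMS = auto()  # components of two different activity systems

# A rough (hypothetical) description of a museum visit as an activity system:
museum_visit = ActivitySystem(
    subject="visitor",
    object="making sense of an exhibit",
    community="other visitors, docents",
    tools=["mobile guide", "exhibit labels"],
    rules=["no touching artifacts", "keep galleries quiet"],
    division_of_labor=["docents interpret", "visitors explore"],
)
```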

Mike Sharples, Josie Taylor, and Giasemi Vavoula. “A Theory of Learning for the Mobile Age.” in Richard Andrews and Caroline Haythornthwaite (eds.) The Sage Handbook of E-learning Research, pp. 221–247. London: Sage, 2007.

Summary: The authors use a conversational model and activity theory as a framework for mobile learning (informal learning, either using mobile tech. or learning while mobile). They frame it as interaction between a learner and technology to advance knowledge. First, conversation, negotiation, and interpretation drive overall learning (“conversation” here means sharing of understanding within a pervasive medium; this definition includes human-machine interaction); it’s about becoming informed about others’ representations. They use a two-level model of learning: acting (problem solving/model building) & description (demonstration/explanation), plus constant internal representation. Within this model, teachers/experts don’t really derive authority through expertise, but rather through negotiation (they recognize that this model doesn’t quite apply to a classroom setting). Second, their AT framework, which describes how tool use helps people learn, includes a first triangle (subject/learner, object/task, community) plus a second triangle that mediates the first (rules/norms, division of labor, and tools, both physical objects and signs). The tools (both semiotic and technical) constrain & support learners in the goal of transforming their knowledge/skills. Dialectical interaction between nodes in the triangle drives learning; the idea is to use this as a framework to pinpoint “tensions” in the user-tool system that inhibit learning. Agency in learning is a property of the system, not of individuals. They describe a case study of mobile technology use in a museum using this framework.

Comments: They mention the digital divide, but point out that mobile technologies are being adopted in many places w/o traditional infrastructure. The AT framework seems more like a model than a predictive theory, unless the prediction is that when all components are working, learning will occur. The conversational model sets up learning as a process of negotiation, and the AT model describes how tool use facilitates this. The AT aspect seems to serve more as an analysis tool that helps in designing technologies to enhance “conversations” in informal learning settings (not to replace traditional learning).

Links to: Roth, Suchman (activity theory)

Categories
exam readings pedagogy research methods/philosophy science studies

Exam reading: “The myth of scientific literacy”

Morris Shamos’ “Myth of Scientific Literacy” starts off with some grim estimates of the state of scientific literacy in the U.S.: maybe 5-7% of Americans are scientifically literate, i.e., able not only to understand science terminology and know some basic facts, but also to understand how the scientific process works. While this book was published in 1995, the situation hasn’t changed much.

Summary: Shamos claims that U.S. educational policy (in many iterations) has been trying both to increase general science literacy and to increase the number of students headed for science careers, and has been failing at both. Science is difficult because it requires a non-commonsense mode of thought; deductive/syllogistic thinking (common sense) can lead to correct conclusions from incorrect assumptions, while science rests on a combination of deduction, induction, quantitative reasoning, and experimentation. Through the history of science education, there has been debate over what to teach and why; Shamos suggests three levels of sci. literacy: cultural (understand some terminology), functional (know some facts), and “true” literacy (understand the scientific process). “Science” education is generally focused on technology or natural history studies (not the scientific process), which would be OK for “science awareness,” but we would also need to add an understanding of the use of experts to assist in making societal decisions. Broad-based sci. literacy is hampered by several factors: mathematical illiteracy, a lack of social incentives, the fact that science can be boring & hard to learn, and disparagement by public intellectuals (and others). He especially cautions against movements to discredit rationalism as the best basis with which to relate nature to society through science.

Comments: On the use of experts: since creating a truly sci. literate citizenry is (Shamos suggests) impossible, he proposes a system of public science experts who help make decisions in a transparent way (with citizen watchdog groups). Overall, a wide-ranging discussion of science education, philosophy of science, and possible future models for science education (also incorporating adult ed, though he focuses on formal ed).

Links to: Pellegrini (models of citizen-scientist expert interactions); Holton (“anti-science” forces)

Categories
exam readings knowledge work pedagogy tech design

Exam reading: “Datacloud”

I often wonder about positive interpretations of the new, “postmodern,” information-dense and chaotic work environment. For example, how well will this exciting new world of info-surfing hold up as a model, given recent evidence that we really can’t multitask? And there are also significant issues skipped over in most discussions of the changing work environment: the wide divergence of incomes between certain classes of knowledge workers and non-knowledge workers, the digital divide, and class stratification.

I certainly don’t know how these things are going to play out. But here’s another exam reading that doesn’t really address them head-on: Johndan Johnson-Eilola’s “Datacloud.” It’s probably a scope issue; he does at least mention these issues, but his focus is clearly elsewhere:

Summary: In this book, Johnson-Eilola tries to describe the changes in the work environment occurring in information-based jobs, and how both education and computer workspaces should be changed to facilitate this new way of working. He describes the standard model of how the “symbolic-analytic” (S-A) workplace is becoming the new postmodern paradigm: fragmented, mobile, computerized, contingent, marked by situation-specific solutions and under-defined goals, playful, and facilitating a concept of the self that is fluid and changeable. He focuses on how different articulations (“suggestions about acceptable meanings”) of technology can be sites of resistance to dominant cultural trajectories. “Articulation theory” is a postmodern adaptation of Marxism which says that subjects are constructed within social/class contexts, but that these sites of negotiation allow the subject some agency. His main focus is on how workspace design (mostly the computer interface, though the physical space is also important) can be changed to facilitate S-A work. S-A work requires the ability to navigate between complex spatial data representations, communicate with other workers as needed, and display some information in different, more permanent locations (e.g., whiteboards). Education spaces need to change to get students comfortable with these immersive work environments. Students also must learn how to be creative about representing and using information (rather than just using ppt or xls defaults).

Comments: He mentions the ideological nature of articulations (e.g., current word processing programs make it hard to work in a dynamically interlinked environment b/c of clunky embedding), but doesn’t go into too much detail about group-level politics. He characterizes hyperspace as linear on a temporal scale, rather than as a fluid network that gives up temporality. Blogs are an example of newly emergent symbolic-analytic spaces (the book was published in 2005): dynamic production sites with RSS feeds that let readers experience them in different temporal & spatial sequences.

Links to: Liu (historicizes symbolic-analytic work); Brown & Duguid (less focus on specifics of work envt.); Spinuzzi (approaches topic from network theory); Bolter (hypertext)

Categories
exam readings pedagogy research methods/philosophy transparency visuals

Exam reading: “E-crit”

This post is a summary of E-Crit: Digital media, critical theory, and the humanities by Marcel O’Gorman. I’ve read this book before and used some of the concepts in a paper; I thought I would read something that was a bit of a review after the last book I read… After reading Opening Spaces, it was interesting to see how this book really focuses on postmodern methods without taking ethical considerations into account (though political considerations are part of it). The intersection of these two texts makes me think of a series of blog posts on iblamethepatriarchy.com that look at the intersection of feminist criticism and postmodern evaluations of art (pretty thought-provoking). Anyway, one of the comments on a post there said that exposure to feminist interpretation ruins all art, because you can no longer look at art without thinking about the material and social conditions under which that art was made. (Not entirely sure what the connection is here, but O’Gorman does a lot of postmodern art analysis as part of his argument.) So if you’re an art lover, maybe better not to follow that link…

Summary: O’Gorman is trying to lay out a shift in academic methods that will revitalize humanities work by taking advantage of possibilities inherent in digital media. For him, academic disciplines are fragmented, hierarchical, and print-centered, which leads to interpretation (hermeneutics) and repetition rather than creativity (heuretics). He foregrounds three types of “remainders”/“others” of academic discourse: puns/nonlinear transitions, digital media, and imagery. He introduces “hypereconomy”: the use of “hypericons” to connect a network of discourses and lead to intuitive, exploratory linkages between them. One big emphasis is on picture theory: images are subjective (non-transparent) and in a struggle with text (think LOLcats: text and image can be contradictory and create new meanings). He contrasts the educational strategies of Ramus (classifying & compartmentalizing knowledge without reference to random mnemonic devices) with the work of Wm. Blake (image/text contradictions, opposition to creating conformist students). He calls hypereconomy a “technoromantic” method of expression, using subjective, affect-based interpretations of print and images to create a bricolage of sorts. These constructs incorporate four primary images: personal, historical, disciplinary, and pop-cultural (he adds a written interpretive component when assigning them in his classes). Part of what they do is promote shifts in the figure/ground relationships in images (via subjective interpretations, “nonsense” connections, and hyperlinking). O’Gorman speculates that constant exposure to visual stimuli is leading to increased abstract & spatial reasoning. He concludes by laying out a plan to rejuvenate humanities departments by incorporating digital media studies and criticism: this would add technological “rigor” but still let departments teach criticism of the changing social/technological environment.

Comments: O’Gorman’s main focus seems to be the hypereconomy method as a tool for invention, and the call to incorporate digital media into humanities departments as a way to subvert “technobureaucratic” management of universities seems a bit tacked on. Some of the visual theory he builds his argument upon (e.g., Gombrich’s “mental set” of interpretations) isn’t empirically supported (as I recall). His concepts about non-transparent visuals & language have been the most useful things for me. I probably fall into the traditional linear-Enlightenment camp & am not convinced that hypereconomy projects can actually lead to useful critiques of institutions (a bit too materialist, I guess). When I first read this book, I had a much stronger reaction to the anti-Enlightenment thread that runs through it; I’m either becoming inured to such a position or starting to reconcile the cognitive dissonances from my previous training…

Links to: Bolter (remediation, transparency)