Categories
geekery tech design

What is that little ‘a’ with the circle around it? ‘At?’ ‘About?’

I found this via Sociological Images and just had to post it. It’s a “Today Show” clip from 1994, in which Bryant Gumbel, Katie Couric, and a co-host try to decipher the ‘@’ symbol and explain what the ‘Internet’ is. They eventually have to ask one of the (presumably) tech guys behind the scenes for help.

To give you some context, this sure-to-be-a-classic conversation took place shortly after the January 1994 Northridge earthquake in California.

Does this mean the “series of tubes” description is pretty historically recent?

Feb. 3: Looks like this YouTube video has been deleted; hope you got a chance to see it!

Categories
exam readings learning theory tech design

Exam reading: “Practices of distributed intelligence”

Today’s reading ties together several themes from other texts on my reading list: using visualizations & networked tools for understanding science, distributed cognition, activity theory, and participatory learning.

This is my last (!) exam reading- I’m on to taking my final exam this weekend. Next step will be fleshing out my project ideas, so more about that later…

Roy D. Pea. “Practices of Distributed Intelligence.” In Gavriel Salomon (ed.) Distributed Cognitions: Psychological and Educational Considerations. Cambridge: Cambridge University Press, 1993.

Summary: The distributed intelligence framework has implications for educational tech., both computational and social. Knowledge is socially constructed through collaborative efforts, as well as distributed into tools (which are in turn designed by social decision processes); however, people are the ones who perform cognition. Intelligence connects means to ends via behavioral or mental adaptations. Object affordances “link perception and action;” objects are designed to be “smart” and simplify our cognition (we don’t notice this once we get used to using them). This includes symbol systems: calculus, numbers, etc. Environmental cues (in objects) help us get from diffuse desires to concrete goals and plans for action. Discusses the history of dist. intelligence: AT (people shape/are shaped by their environments in dialectical fashion), computers reorganize (not just augment) mental functions. Some key tools: science visualization tools, “guided participation,” situated cognition. We should teach students to use tools (esp. computers) with the idea that they will change what students need to know, rather than just increase task efficiency. Some trade-offs with this approach: access to an activity vs. understanding its foundations, static task definitions vs. dynamic definitions (more difficult to design for dynamic tasks). The main idea is to teach students to use tools (alone or in groups), rather than to focus on individual testing.

Comments: Ties to distributed cognition, visualization, activity theory, and participatory learning. While focus is on school settings, some of these concepts could apply to informal learning situations (e.g., affordances in tools/devices, distinction between doing an activity and actually understanding the concepts behind it).

Links to: Roth (AT); Nersessian (discusses dist. cog. and mental models)

Categories
exam readings pedagogy tech design

Exam readings: computer-based learning

Today’s two readings weren’t exactly what I expected, being more focused on general ideas about how to use electronic technologies in the classroom than on concrete recommendations for augmenting learning electronically. That said, both articles are relatively dated (apparently, anything from the grunge era is now dated, and that includes journal articles), so I probably should have expected them to be less than up-to-date…

Roy D. Pea. “Augmenting the Discourse of Learning With Computer-Based Learning Environments.” In Erik De Corte, Marcia C. Linn, Heinz Mandl, and Lieven Verschaffel (eds.) Computer-Based Learning Environments and Problem Solving, pp. 313-343. New York: Springer-Verlag, 1992.

Summary: Pea’s focus in this paper is on using electronic technologies to augment “learning conversations” that help students become participants in communities of practice. Basically, he is interested in the social aspects of cognition. In the communities of practice view, learning is integral to becoming a member of a community and maintaining membership; participation in the community, rather than information transfer, is what facilitates learning. Conversation is a key part of this process; learning conversations involve the creation of communication and the interpretation of meanings, constructing common ground among participants. In science, students need to be able to “talk science,” rather than just listening to lectures and reading textbooks. Learning language and other symbols (e.g., diagrams) and being able to “converse” with them is a large part of enculturation; this involves discussion of how different representations relate to one another and to the physical world. Enculturation/increasing participation occurs via appropriation and interpretation of language and symbols. Computer tools for learning can’t teach discourse directly, but they can provide tools for developing skill in working with representations, as well as augmenting learning conversations. He discusses a case study of developing a system for creating interactive optics diagrams. One key suggestion is that such tools should create affordances that facilitate the production of visualizations, support interpretation, and help build “sense-making” (causal) narratives.

Comments: The background might be useful, but the majority of the paper is centered on classroom applications, with fewer concrete recommendations for the design of such environments than I’d thought there would be (one case study).

Links to: Lave & Wenger (comm. of practice); Roth & McGinn (also focus on science learning via representations); Scardamalia & Bereiter (using computers more for communication in education)

Marlene Scardamalia and Carl Bereiter. “Computer Support for Knowledge-Building Communities.” Journal of the Learning Sciences 3(3): 265-283, 1994.

Summary: The authors want to restructure schools as collective knowledge-building communities (KBCs). In these computer-supported intentional learning environments (CSILEs), patterns of discourse would mimic those of KBCs in the real world. Their ideas come from metacognitive learning, expertise-building via progressive problem-solving, and KBCs (here, schools would provide social support and a collective knowledge pool). They discuss ways schools inhibit this type of learning (e.g., individual focus, formal/demonstrable knowledge, lack of support for progressive problem-solving). The idea is to reframe general discourse around the collaborative processes of research communities (e.g., journal articles represent advances in knowledge, peer review is a way to validate them). While educational technology generally supports individualized learning & drill/test, they propose a different focus. CSILEs would include: a central database in which students post “new knowledge;” features that let students comment on and build on the contributions of others (structuring communication around problems and the group’s knowledge-building); explicit discussions of metacognition; small-group discussions; and tools to support different media and students who contribute different dimensions of knowledge. The general idea is that information access alone is not sufficient; you need both computer tools that explicitly build these communities and teacher strategies for promoting participation.
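
To make the database idea concrete for myself, here’s a minimal sketch (my own illustration in Python, not code from Scardamalia & Bereiter) of the kind of data model a CSILE-style communal knowledge base implies: student notes organized around problems, with “build-on” links between contributions. All names and examples here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Note:
    """A single student contribution to the communal database."""
    author: str
    problem: str                                        # discourse is organized around problems
    text: str
    builds_on: list[int] = field(default_factory=list)  # ids of earlier notes this one extends

class KnowledgeBase:
    """Central shared pool of contributions (the CSILE-style database)."""

    def __init__(self) -> None:
        self.notes: list[Note] = []

    def post(self, note: Note) -> int:
        """Add a contribution to the shared pool; returns its id."""
        self.notes.append(note)
        return len(self.notes) - 1

    def thread(self, problem: str) -> list[Note]:
        """All contributions addressing a given problem, oldest first."""
        return [n for n in self.notes if n.problem == problem]

# Hypothetical usage: two students building knowledge together.
kb = KnowledgeBase()
first = kb.post(Note("Ana", "Why do leaves change color?",
                     "Chlorophyll breaks down in the fall."))
kb.post(Note("Ben", "Why do leaves change color?",
             "Building on Ana: other pigments become visible once chlorophyll fades.",
             builds_on=[first]))
print(len(kb.thread("Why do leaves change color?")))    # 2 contributions on this problem
```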

Comments: Less helpful for concrete ideas than I’d assumed (perhaps because it’s dated?). Focus is on schools, rather than informal learning. Might be able to extrapolate from these ideas, but I’m not sure how this concept would translate to an informal setting.

Links to: Lave & Wenger?

Categories
exam readings learning theory pedagogy tech design

Exam readings: mental models and wireless devices

Not really related, but these two readings are on mental models (pretty theoretical) and things to think about when incorporating wireless devices into the classroom (more practical):

Nancy J. Nersessian. “Mental Models in Conceptual Change.” in Stella Vosniadou (ed.) International Handbook of Research on Conceptual Change, pp. 391-416. New York: Routledge, 2008.

Summary: Nersessian’s main aim is to outline a framework for how mental models work and to use it to support conceptual change (Kuhnian “paradigm shifts” in science & for science learners); she spends most time on the former. One mechanism for change is building new mental models & conceptual structures. Aspects of the mental model framework are debated; one constant is that mental representations are organized into some sort of units with a relational structure. The approach assumes that: “internal” & “external” are valid categories; internal symbolic structure is iconic (perceptual, with properties based on those of objects in the external world) rather than rule-based or linguistic; and skill in modeling is partially biological, partially from learning in social/natural contexts. Discusses four different strands in research: “discourse models” derived primarily from language/instruction (we mentally manipulate these ideas via models, not words); spatial simulation (which seems to be perceptually based, but not entirely visual); “mental animation” (more advanced; requires causal/behavioral knowledge); and internal-external coupling (we should define external representations as part of our extended cognitive capacities). The idea of embodied representation is that perceptual experience is fundamentally tied to mental modeling processes. The entire system has both modal and amodal aspects; some concepts & processes are grounded in context, others aren’t. For conceptual change, we need to explicitly run people through model-changing activities; abstract activities can provide support in the form of mental inventories of affordances and constraints in different domains.

Comments: Specific tools that participate in coupled internal-external representational systems are “cognitive artifacts.” These include writing & diagrams; they function as external and social memory supports. Ties together the cognitive model approach with theories of social and distributed cognition.

Links to: Rapp & Kurby (perceptual vs. amodal models of cognition); Lave & Wenger (social cognition); Zhang & Norman (internal-external coupling)

Jeremy Roschelle, Charles Patton, and Roy D. Pea. “To Unlock the Learning Value of Wireless Mobile Devices, Understand Coupling.” Proceedings of the IEEE International Workshop on Wireless and Mobile Technologies in Education, 2002.

Summary: The authors feel that handheld computers (wireless internet learning devices, or WILDs) could become ubiquitous in classrooms, but conceptual issues need to be resolved before using them on a large scale. The issue they focus on is “coupling” between social & informatic worlds with different expectations. Challenges are political, organizational, and pedagogical: e.g., how (and by whom) messaging tech should be controlled, how to regulate roles in a shared info space, how learning resources should be stored & accessed, who decides about privacy levels, and how integrated or segregated students’ learning environments should be. The big issue is who will make these decisions; the authors suggest that these things need to be worked out, or we risk rejection of these tools by students, teachers, or others. They focus on three main design problems. 1) Curricular activity spaces vs. personal learning connections: students perceive the devices as comm. tools, while teachers want to use them to augment classroom activities (students may need separate devices for class). 2) Integrated vs. synchronized educational databases: what info will be centralized & who will have access to it. 3) Broad vs. narrow technological mediation of discourse: face-to-face interaction is still important; we may want to take the minimal-mediation route.

Comments: The authors outline some critical issues to take into account before using these devices in a classroom setting. Some of these things to think about would apply to informal settings as well, e.g., how much mediation there is and what info is being stored by the system (if any). Probably only tangentially related to my project.

Links to: Sharples et al. (these concerns relate to AT framework for understanding learning tool use); Borgmann et al. (cyberlearning)

Categories
exam readings networks pedagogy tech design

Exam readings: networked tech and STEM learning

Two related readings on networked technologies and science: the first an NSF task force report on cyberlearning, and the second on “collaboratories” (collaborative laboratories).

Christine L. Borgman, Hal Abelson, Lee Dirks, Roberta Johnson, Kenneth R. Koedinger, Marcia C. Linn, Clifford A. Lynch, Diana G. Oblinger, Roy D. Pea, Katie Salen, Marshall S. Smith, and Alex Szalay. “Fostering Learning in the Networked World: The Cyberlearning Opportunity and Challenge.” Washington, DC: National Science Foundation, 2008.

Summary: Task force report designed to give NSF guidance on cyberlearning: “networked computing and communication technology to support learning.” Their focus is on using CL to support STEM education in a lifelong, customized setting, redistributing learning over space & time. The authors believe there’s high potential now because of new technologies, increased understanding of learning processes, and demand for solutions to educational problems. Some examples: Web tech breaking down location barriers, open & multimedia educational resources, new techs. making learning affordable & accessible, cloud computing, customizable content, and an enthusiastic audience (though schools aren’t up to speed on digital techs). Key potential problems: responsible data use/data overload, scaling technologies for large communities, and how to apply software & other resources. Several issues require action: data management, guaranteeing open/accessible resources, and an NSF strategy of funding projects that produce resources for both education & research. They have 5 main recommendations, including a “platform perspective” (shared & interoperable designs) and a requirement that developed resources be open & freely shared.

Comments: Apparently, while the public doesn’t respect education, we do like electronic gadgets, so the idea is to use these to educate people. This reference will mainly be useful for giving me a sense of the state of the field. They do ask one question that’s interesting: should we train people to work in interdisciplinary teams, or increase the versatility of individuals? (The trend seems to be to work in teams…)

Links to: Finholt (collaboratories)

Thomas Finholt. “Collaboratories.” Annual Review of Information Science and Technology 36(1): 73-107, 2002.

Summary: “Collaboratories” = collaborative laboratories or “labs without walls;” joint science work has historically depended on physical proximity, esp. science with large instruments (or specific study sites). While one answer has been residencies, the problems of this structure have remained, primarily barriers to access and research. Science has been moving toward large, complex distributed projects; we can consider these a type of distributed intelligence. Collaboratories require two types of IT: increased communication plus better access to instruments and data (data sharing/data viz. tools, remote-use instruments). Finholt discusses the history of such projects, from the “memex” concept & ARPAnet to current projects in various disciplines. These still involve a small number of participants; shared libraries and datasets see more use. Other lessons: people can use them sporadically and still be useful, easily integrated software is more readily accepted (e.g., web-based), some types of activity are naturally better suited to collaboratories (data collection vs. idea generation), and there are new expectations for participants. Challenges: moving from shared space to virtual space introduces new demands (implicit interactions such as pointing or gaze must be made explicit), and willingness to collaborate and adopt tools is also an issue.

Comments: Points out that increased communication can lead to Balkanization as well as broader communication (social exclusivity); benefits are highest for students & non-elite scientists, and a drawback might be these projects becoming pools for marginalized scientists (e.g., e-journals have lower status). F2F interactions are still crucial for establishing contacts, and meetings are still important; these projects will augment, rather than replace, current practice.

Links to: Howe (crowdsourcing)

Categories
exam readings networks tech design

Exam reading: “Games with a purpose”

People teaching computers to do stuff. Actually, I’ve played one of these games (though a 1-person variety). It was kind of fun.

Luis von Ahn and Laura Dabbish. “Designing Games With A Purpose.” Communications of the ACM 51(8): 58-67, 2008.

Summary: Games with a purpose (GWAPs) involve people performing tasks that can’t be automated, e.g., image tagging, collecting facts, etc. Related to the open-source software movement, non-game crowdsourcing, and gamelike interfaces in business apps. The authors first describe three categories for these games: 1) “output-agreement:” players see the same input & must produce the same output (e.g., give the same label to a photo); 2) “inversion-problem:” one player describes something, the other guesses it (sim. to 20 Questions); 3) “input-agreement:” players are given inputs and must describe them to determine whether they have the same input or not (e.g., both describe a music clip and then guess whether the other player’s clip matches their own). For enjoyable play, you need to add features to these templates: time limits, scorekeeping, high scores, randomness, leveling up. They describe mechanisms to guard against player collusion, e.g., cross-checking and random matching of players; games can also be modified for n ≠ 2 players. Games are evaluated by throughput × enjoyability (average lifetime play) = “expected contribution;” this doesn’t capture popularity or word of mouth. They point out that their examples focus on similarity/matching; a different template would be needed for gathering diversity.

The goal of GWAPs is to capture large datasets for developing programs with advanced perceptual capabilities.
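
As a note to myself on how the templates and the evaluation metric fit together, here’s a minimal sketch (my own illustration in Python, not code from the paper); the function names and the numbers are hypothetical.

```python
# Sketch of the "output-agreement" template and the expected-contribution
# metric described by von Ahn & Dabbish. Purely illustrative.

def output_agreement(label_a: str, label_b: str) -> bool:
    """Both players see the same input (e.g., an image); a label only
    counts as verified output if the two players independently agree."""
    return label_a.strip().lower() == label_b.strip().lower()

def expected_contribution(throughput: float, avg_lifetime_play: float) -> float:
    """Expected contribution per player = throughput (problem instances
    solved per hour of play) x average lifetime play (hours)."""
    return throughput * avg_lifetime_play

# Hypothetical numbers: a player who produces 5 agreed-upon labels per hour
# and plays 20 hours in total contributes about 100 labels.
print(output_agreement("Dog", " dog"))     # True -> label accepted
print(expected_contribution(5.0, 20.0))    # 100.0
```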

Comments: The focus here is on machine learning, rather than human learning (“useful computation as a side effect of enjoyable game play,” p. 61), but one could potentially link such a system to an educational tool as a crowdsourced informational resource.

Links to: Brown & Adler (general online learning); Howe (crowdsourcing)

Categories
exam readings politics tech design

Exam reading: “Questioning technology”

Moving on to the next book, so no extra commentary… Andrew Feenberg’s “Questioning Technology”:

Summary: Feenberg proposes a middle ground between technological determinism and the belief that technology is a neutral force: the idea that technology does influence society, but that society can also influence technological development. Tech. development can either reinforce or be used to change existing power structures; design is itself a political act, because the choice between design alternatives takes place against an implicit background of social norms and codes. In order to counter tendencies toward technocracy, Feenberg proposes a “micropolitics of technology:” localized citizen or user involvement in making decisions about development choices. He calls this “democratic rationalization.” Key elements are communication channels (between user networks), dialogue between experts and the public, and attention to industry-externalized costs/tradeoffs. Finally outlines two poles or trends for technology: concretization (elegant design, combining multiple functions in one part) and differentiation (local adaptation of flexible(?) technologies into social systems). Concretization tends to occur in objects that have “primary instrumentalization” (decontextualization, reductionism, autonomization, positioning); differentiation occurs with “secondary instrumentalization” (systematization, mediation, vocation, initiative).

Comments: Politics: differentiates between “thin” (personal freedoms, mass-media driven) and “strong/deep” (emphasis on local collective action) democracy. Options to increase public participation are town-hall meetings (of limited use), influencing professional societies, and public participation in planning in areas with loose government control (utilities, hospitals, land use). Philosophy: I’m skipping over the philosophical basis of his model (big q: does controlling objects violate their integrity & make them “less”?).

Links to: Johnson (user involvement with tech.); Norman (tech. design); Liu (politics of tech.); Tomlinson (envt. & development)

Categories
exam readings knowledge work pedagogy tech design

Exam reading: “Datacloud”

I often wonder about positive interpretations of the new, “postmodern,” information-dense and chaotic work environment. For example, how well will this exciting new model of info-surfing hold up, given recent evidence that we really can’t multitask? And there are also significant issues skipped over in most discussions of the changing work environment: the wide divergence of incomes between certain classes of knowledge workers and non-knowledge workers, the digital divide, and class stratification.

I certainly don’t know how these things are going to play out. But here’s another exam reading that doesn’t really address them head-on: Johndan Johnson-Eilola’s “Datacloud.” It’s probably a scope issue; he does at least mention these issues, but his focus is clearly elsewhere:

Summary: In this book, Johnson-Eilola tries to describe changes in the work environment occurring in information-based jobs, and how both education and computer workspaces should be changed to facilitate this new way of working. He describes the standard model of how the “symbol-analytic” (S-A) workplace is becoming the new postmodern paradigm: fragmented, mobile, computerized, and contingent, with situation-specific solutions, under-defined goals, an element of play, and a concept of the self that is fluid and changeable. He focuses on how different articulations (“suggestions about acceptable meanings”) of technology can be sites of resistance to dominant cultural trajectories. “Articulation theory” is a postmodern adaptation of Marxism that says that subjects are constructed within social/class contexts, but that these sites of negotiation allow the subject some agency. His main focus is on how workspace design (mostly the computer interface, though the physical space is also important) can be changed to facilitate S-A work. S-A work requires the ability to navigate between complex spatial data representations, communicate with other workers as needed, and display some information in different, more permanent locations (e.g., whiteboards). Education spaces need to change to get students comfortable with these immersive work environments. Students also must learn how to be creative about representing and using information (rather than just using PowerPoint or Excel defaults).

Comments: Mentions the ideological nature of articulations (e.g., current word processing programs make it hard to work in a dynamically interlinked environment b/c of clunky embedding), but doesn’t go into too much detail about group-level politics. Characterizes hyperspace as linear on a temporal scale, rather than as a fluid network that gives up temporality. Blogs are an example of new (the book was published in 2005) emergent symbolic-analytic spaces: dynamic production sites with RSS feeds that let readers experience them in different temporal & spatial sequences.

Links to: Liu (historicizes symbolic-analytic work); Brown & Duguid (less focus on specifics of work envt.); Spinuzzi (approaches topic from network theory); Bolter (hypertext)

Categories
exam readings research methods/philosophy rhetoric tech design

Exam reading: “User-centered technology”

In this book, Robert Johnson explores technical design from a rhetorical perspective. His “system-oriented” and “user-oriented” distinction brings to mind this cartoon.

Summary: Johnson explores the relationship between technology and people from a technical communication perspective. He articulates two types of knowledge: expert theoretical knowledge and applied practical knowledge; traditionally, theoretical knowledge is valued more highly. End-users are often invisible in the design process, which can lead to problem-prone technologies. Johnson advocates getting users involved from the start of the design process and incorporating their practical, task-based knowledge into design. He contrasts this approach (user-centered design) with system-centered and user-friendly design processes. Johnson grounds his book in a “user-centered rhetorical complex of technology:” a reworked version of the rhetorical triangle that places the user at the center; has the designer, the system, and the user tasks as the vertices; and places this relationship within concentric circles of general activities (learning, doing, producing), constraints of human networks (institutions, disciplines, community), and finally larger social factors (culture, history). He emphasizes the importance of reflecting on assumptions about technological determinism when designing large projects. He also discusses the specific case of producing technical explanations for computer systems (e.g., documents should be organized in a task-oriented way). He ends by connecting the aims of technical writing pedagogy to those of rhetoric (focus on the user/audience, supposed to be working toward “the good”), and suggesting a service-learning approach for tech writing classes.

Comments: Johnson falls somewhere in between Feenberg and Norman on an axis of “politics and philosophy” vs. “design for easy use.” For non-technical writers, there are still some good design ideas here (though he does emphasize the application of his ideas to this field), mainly having user input throughout the design process, designing with specific tasks in mind, and avoiding a “design for dummies” approach.

Links to: Norman (user-friendly approach); Feenberg (phil./politics of technology); Gee (learning by doing)

Categories
exam readings politics tech design

Exam reading: “Free culture”

In “Free culture,” Lawrence Lessig looks at how copyright law intersects with new media, and at the freedom to experiment with information afforded by electronic technologies. There are some timely connections to recent news events in this book.

Summary: After outlining the history of copyright in the U.S., Lessig describes recent moves by content distributors to expand the scope of copyright law. Online content includes traditionally non-commercial culture and can easily be monitored, leading to a trend for copyright holders to encroach upon traditionally free/fair use of intellectual property. Lessig discusses the tremendous possibilities for cultural creativity using electronic media, as well as industry (MPAA, RIAA) attempts to quash creative efforts. He outlines four ways that intellectual property is regulated: laws, cultural norms, market forces, and the distribution architecture itself. His thesis is that the law should change to balance the public interest with the interests of copyright holders. Currently, law, the market, and distribution architecture are all being used to restrict fair use of copyrighted materials (e.g., extending copyright terms, software permissions for use, software making everything regulable, media consolidation), while cultural norms are becoming more accepting of piracy and illegal activity. For Lessig, the ideal is limited-term copyright, followed by either release into the public domain or extended copyright under a system that makes it easy for later users to get permission to use works. He also discusses positive developments, including open-source development and Creative Commons licensing.

Comments: Lessig offers several examples of historic and current conflict between content producers and later users (e.g., radio, p2p file sharing, documentary filmmaking, news archiving), which provide a good historical grounding. He likens the current situation to Prohibition (stifling laws with little public support leading to widespread illegality).

Links to: Liu (discusses uses of “free” info); Feenberg (politics/philosophy of material technology)