Categories
environment rhetoric

A question of interpretation

It’s been a month since the Deepwater Horizon oil spill site was temporarily capped. So what is going on with the oil from the spill? Various media outlets have recently been reporting that about 75% of the oil has been “dealt with,” meaning that only about 25% remains a threat to wildlife. Pretty good, right? These numbers come directly from a NOAA report, which sorts the oil into seven categories:

NOAA estimate of oil status: Aug. 2, 2010

Well, it turns out that what “dealt with” means is a question of interpretation. To me, “dealt with” implies “taken care of,” “no longer a concern,” and “under control,” and I suspect the phrase carries those connotations for many people. Organizations reporting on the NOAA report are including the “Evaporated or Dissolved,” “Naturally Dispersed,” and “Chemically Dispersed” categories in their 75% “dealt with” calculations, implying that these categories of oil are no longer a problem. (This interpretation is encouraged by the NOAA chart, which highlights that oil in some categories is being naturally degraded.) But what do these categories actually mean?

“Dispersed” oil is oil that has been broken into tiny droplets. In the case of chemically dispersed oil, these droplets are coated with chemical dispersants; in the case of naturally dispersed oil, the droplets have been broken up by wave action or some other means. The oil is still in the water (it hasn’t magically disappeared), but it’s in tiny drops rather than large slicks or goopy tar balls. Think about washing your dishes: the dish soap breaks up the greasy residue on your frying pan so the water can wash it away, but that grease is still in the water that goes down the drain. “Dissolved” oil has been broken into even smaller pieces, but it, too, is still in the water; it’s just impossible to see.

Scientists from the Georgia Sea Grant program now say that this dissolved and dispersed oil is not “dealt with,” and given how I understand the phrase, I have to agree. In fact, most of it, up to 79% of the spilled oil, is still in the ocean, creating problems (pdf of press release). Here’s the Sea Grant analysis of the situation (note that the Sea Grant scientists do not include the “direct recovery” oil from the NOAA chart in their analysis, since it was never actually spilled into the water, so their numbers differ):

Georgia Sea Grant estimate of oil status: August 16, 2010.

So, what is going on here? The discrepancy over the amount of oil remaining in the water seems to hinge largely on how these different categories of oil are interpreted. (The Sea Grant and NOAA scientists also disagree on the underlying calculations, which creates additional differences between the two estimates.) The NOAA report is optimistic in tone and interpretation, but note the caveat at the end of this passage:

“A third (33 percent) of the total amount of oil released in the Deepwater Horizon/BP spill was captured or mitigated by the Unified Command recovery operations, including burning, skimming, chemical dispersion and direct recovery from the wellhead… An additional 25 percent of the total oil naturally evaporated or dissolved, and 16 percent was dispersed naturally into microscopic droplets. The residual amount, just over one quarter (26 percent), is either on or just below the surface as residue and weathered tarballs, has washed ashore or been collected from the shore, or is buried in sand and sediments. Dispersed and residual oil remain in the system until they degrade through a number of natural processes. Early indications are that the oil is degrading quickly.”

This one-line caveat is an important one, because it implies that these types of oil are not “dealt with.” Yes, some of the oil will be degraded naturally by microbes, but a large portion has settled into deep underwater plumes where there is little microbial activity, and another chunk is buried in anoxic wetland sediment, where it is unclear how quickly it will degrade. This oil is certainly not “dealt with” today, nor will it be for months, years, or possibly decades, and other media outlets are beginning to pick up on that fact. While it’s out of sight, it shouldn’t be out of mind.
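
As a back-of-envelope exercise (mine, not NOAA’s or Sea Grant’s), here’s how the same four NOAA percentages support both headlines, depending on which categories you count. Note that the quote doesn’t break out either the dissolved share of the 25% or the chemically dispersed share of the 33%, so those splits are arbitrary placeholders below, and Sea Grant also drops direct wellhead recovery from the total, which this sketch doesn’t attempt to reproduce:

```python
# The four categories from the NOAA quote above (they sum to 100%).
captured_or_mitigated = 33   # burning, skimming, chemical dispersion, direct recovery
evaporated_or_dissolved = 25
naturally_dispersed = 16
residual = 26

# Media framing: everything except the residual counts as "dealt with."
dealt_with = captured_or_mitigated + evaporated_or_dissolved + naturally_dispersed
print(f"'Dealt with': {dealt_with}%")  # 74% -- the ~75% in the headlines

# Alternative framing: dispersed and dissolved oil is still in the water.
# Both split values below are placeholder guesses, NOT numbers from the report.
dissolved_fraction = 0.5   # share of the 25% that dissolved rather than evaporated
chem_dispersed = 10        # percent of the total chemically dispersed (part of the 33%)

still_in_water = (evaporated_or_dissolved * dissolved_fraction
                  + naturally_dispersed + chem_dispersed + residual)
print(f"Still in the water: ~{still_in_water}%")  # ~64.5% with these guesses
```

The exact number isn’t the point; the point is that “dealt with” versus “still in the water” is a choice about which categories to count, and reasonable groupings land anywhere from the headline 25% remaining up toward Sea Grant’s 79%.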

Categories
exam readings information representation

Exam reading: “Orality and literacy”

I felt like I needed to do a re-read today and take a little bit of a break. “Orality and Literacy,” by Walter Ong, is about the ways that writing technologies have affected human ideas and expression. It’s a book I’ve read in two classes thus far.

One of the things I had fun with the last time I read this book was an assignment to represent the different developments it discusses in timeline format. Since Ong focuses primarily on European history (though he does provide examples from other cultures), I wanted to include developments from other regions and situate the whole project within the entire timeline of human history. Part of my reason for this is my background with Hawaiian culture, a culture in which the transition from orality to literacy happened fairly recently. It required a lot of thought, some big changes in scale, and several hours in Illustrator, but I’m pretty happy with it.

Here’s a preview of the timeline:

Ong timeline preview

Here’s the full pdf. (Dates are approximate.)

And here’s my summary of the book:

Summary: Taking oral culture as a baseline, Ong explores the impacts that writing technologies (first script, then print, and now electronic technologies) have had on human expression, patterns of thought, and society at large. For example, Ong states that oral cultures were largely communal, and that storytelling relied on creative and situation-dependent groupings of stock formulas and characters. Learning was rooted in apprenticeship and daily practice, and naming gave people power over objects. The technological shift of writing led to solitary contemplation of ideas, complex storytelling, abstract learning, and using names as tags in text “containers.” Print enhanced these developments, and led to a more sophisticated use of visual space, organization of information (e.g., indexing, glossaries), and dominance of a few writing systems. Finally, electronic technologies are continuing some print trends (e.g., spatializing information, a high level of text processing), but are also promoting a turn to a “secondary orality” of participatory expression. Ong addresses several other issues important to cultural scholars, such as a shift from aural to visual sense primacy that writing helped promote, the rise of the unitary self, and the incompleteness of a sender/receiver model of communication.

Comments: This is a foundational work that informs the entire field of T&T, and it contains a large amount of material I’ve glossed over here (e.g., effects on memory, theoretical differences between writing systems). Ong states that most literary historians of his time approached oral cultures from an unconscious written perspective; he tries to point out the biases that can creep into scholarship from this perspective, which are important to keep in mind. I remain not entirely convinced that there actually was a dramatic aural/visual shift of the sort that Ong (and other theorists) proposes, but that’s only one of the book’s main themes.

Links to: McLuhan (communication media effects); Benjamin (changing technologies’ effects on perception of works); Headrick (communication media); Turkle (identity & writing technology); Murray (narrative and technology); Bolter (remediation during media shifts); some connections to visuals, info organization

Categories
exam readings networking networks

Exam reading: “Network”

Back to my exam readings in this post… “Network,” by Clay Spinuzzi, is an account of the operations of a telecom company: its development, problems, how it operates successfully, and how work that seems simple from the outside is really quite complex. A large part of the book is dedicated to exploring two theories describing networks (about which, more below).

While not long, it is definitely a dense book: there’s no excessive repetition, but it did take longer to finish than I had estimated. I’ll read some short texts next so I can feel better about crossing things off my list…

Summary: In this book, Spinuzzi uses two theories to describe the structure and function of a telecommunications company: Activity Theory (AT) and Actor-Network Theory (ANT). He chose a telecom company as an example of the highly decentralized type of knowledge work that is becoming more common in modern organizations. AT is a theory of learning and development through interaction, based largely on Marxist dialectics; ANT is a descriptive theory, based largely on rhetoric, that focuses on how shifting relationships among actors in a network help define those actors. Spinuzzi spends a lot of time exploring the similarities and differences between these two theories and giving examples of how they apply to situations at the company. Four characteristics of highly networked organizations are: members have heterogeneous skills/tasks, members are multiply linked, transformative shifts can change the goals of the network, and certain processes within the network are “black-boxed” (they appear simple from the outside when they are, in fact, not). Texts help connect the different actors within the network in three ways: they are stable traces of (ephemeral) ideas, the structure of genre helps organize unfamiliar information into familiar patterns, and they act as boundary objects among actors operating within the network in different contexts. According to Spinuzzi, each of these theories can describe different aspects of “net work,” although he concludes that AT (with its developmental focus) is most appropriate for similar future studies, provided dialogue and rhetoric (strengths of ANT) are taken into account.

Comments: Although there are some interesting political implications of knowledge work here (e.g., worker segregation by education, the “homework economy,” the necessity of continual retraining), the most immediately useful aspects of this book for me will probably be the focus on learning in networks and how texts can function in shaping networks. There is a lot of material in this book, and it’s also useful as an introduction to AT and ANT that is grounded in specific examples.

Links to: Tomlinson (theoretical aspects of networks); Haraway (cyborg identity, workers’ need for constant learning); Brown & Duguid (information networks)

Categories
random travel

Road Trip: Trees

Our 20-or-so-hour road trip from Ithaca to Orlando (not counting stops) took us through several different ecoregions: 11 of them, according to the EPA classification (see map below). We also went through several parts of the Appalachian Mountains, from the northern Plateau, through the Ridge and Valley Province in PA, along the Blue Ridge Mountains, down the Piedmont, along the eastern Coastal Plain, and finally onto the Great Sandbar (Florida).

Ecoregions of the Lower 48

The Appalachian range is a venerable, ~480-million-year-old mountain chain. However, with great age comes great erosion. I like my mountains dramatic, rising abruptly from the plain (or ocean), not so much a series of big lumpy hills. Then again, living in Florida, I shouldn’t complain. But there was another aspect of the scenery that was less than thrilling to gaze upon hour after hour: the trees.

To oversimplify a bit, most of the eastern U.S. used to be covered with hardwood forest. Most of that forest was logged and converted to farmland starting in the 1700s, but large portions are now reverting to forest as agricultural operations move farther west. This means that secondary forests are springing up, though these patches are broken up by remaining farmland and urbanized areas.

While there are some fairly big differences between the ecoregions we drove through, they mainly fall into one large category: eastern temperate forests. So guess what most of the scenery from New York to Florida involves? Trees. Trees on hills, trees on plains, some farms surrounded by trees, and cities and suburbs with no trees. The occasional stream or river, overhung by… trees. And mostly broadleaf trees: oaks, elms, maples, etc. Lovely species all, but not terribly exciting to look at after about hour number 10.

Most of my distance driving has been done in the plains, the western U.S., or Hawaii (assuming two hours at a time counts as distance…). Granted, the plains are none too thrilling in places (corn or soybeans?). But at least you can see for long distances; none of this broadleaf brushy stuff blocking your view of everything more than 10 feet from the road (I’m talking about you, I-26 through South Carolina). No distant mountains to gauge your progress against, very few dramatic changes in elevation, and always this leafy temperate forest underbrush. We were glad to get back to Florida, where even though half of what you see on I-95 is pines and palmettos, there’s a lot of short-distance variation: saw palmetto, pine flatwoods, prairie, and marsh, all determined by very subtle differences in elevation and moisture.

Categories
museums random travel

Road Trip: Trilobites, Tiktaalik, and Trek

Our drive from Ithaca back to Orlando took a bit more than 20 hours. Driving north, we’d broken it up into two overnights (Charlotte, NC, and Scranton, PA), with a stop in Harpers Ferry, WV. We decided to take it a bit slower on the way back, stopping just outside Philadelphia (taking the train into town to do some sightseeing), near Reston, VA, and finally just south of Charlotte in South Carolina. That worked out better psychologically, because we drove less and did more fun stuff along the way.

In Philadelphia, we visited the Academy of Natural Sciences, a mecca for diatom studies 🙂 While not the largest natural history museum I’ve visited, it did have some interesting and well-crafted displays. There was clearly a large emphasis on interactivity in the newer displays, though they also have some more traditional dioramas with stuffed megafauna. It was interesting to see three eras of museum philosophy represented in the same building. At the entrance, there was a sort of “curio cabinet” display, with a mix of pressed plants, stuffed animals, and fossils presented without context in a series of cubbyholes, typical of early museum displays. Next, more modern dioramas situated stuffed animals in the context of the plants and scenery of the ecosystems in which they are (or were) found. Finally, there were the more (postmodern?) interactive and hands-on exhibits, like a glass globe that could show CO2 emissions, temperature, sea levels, continental drift, etc. over time, depending on what the user selected. I’ve always seen interactive exhibits as more a part of science museums than natural history museums (the latter usually being attached to active research institutions, with less of a “learn about electricity” emphasis and more of a “learn about the ecology of our river” one).

(The trilobite and Tiktaalik from the title of this post came from the Academy. Several trilobite fossils were on display, as well as a cast and recreation of Tiktaalik. We also bought a fossil trilobite and a Tiktaalik poster at the gift shop. Trilobites are an extinct group of arthropods that lived from roughly 550 to 250 million years ago: a very long time span! They looked somewhat like big isopods (do not click on this link if you are afraid of giant bug-like critters), though they weren’t closely related. Tiktaalik is an extinct fish from the Devonian period (~375 mya) with many features of early amphibians. It is one of a series of so-called “missing links” between fish and amphibians that creationists like to pretend don’t exist.)

In Philadelphia, we also went to the Museum of Art. It’s a huge museum, and we didn’t really have enough time to see everything because we wanted to get back on the road. The highlights for me were the reconstructions of buildings installed in several rooms: a Japanese Buddhist temple and teahouse, a Chinese manor house entry and Buddhist temple, an Italian cloister courtyard, part of an Indian Hindu temple, and a European chapel. It’s a very different experience to walk into a room and be surrounded by the works, rather than just viewing them on the walls. There were also a lot of paintings, many famous (Van Gogh’s Sunflowers, a bunch of Monets, that sort of thing…).

The next day, we stopped at the Udvar-Hazy Center near Dulles, VA, part of the National Air and Space Museum. This is essentially a series of giant Quonset huts full of aerospace artifacts: planes, helicopters, replica satellites, missiles, and… the Enterprise! The space shuttle, not the starship, but still, it was pretty cool (and the Trek of the post title; hey, I did need another “T” word…). Having already been to Kennedy Space Center, I was probably less impressed with the various space-related artifacts than I would otherwise have been, but KSC doesn’t have a shuttle in its museum. More sobering were the various missiles on display, as well as the Enola Gay (probably the single artifact on display anywhere in the world responsible for the greatest number of human deaths).

Overall, it was nice to be able to catch a few museums along the way. Philadelphia would be interesting to visit for a longer period; we didn’t get to see any of the really historic areas. We also stopped in Savannah (GA) for lunch on our last day of driving and walked a bit in the historic district. It would be cool to spend a weekend there sometime. Though maybe not in the summer…

Categories
exam readings information representation visuals

Exam Reading: “When Information Came of Age”

It’s been a while since I’ve posted, because I’ve been wrapping up an internship at the Cornell Lab of Ornithology. I’ll post about what I’ve done there later… For now, I’ve got a new book summary. “When Information Came of Age,” by Daniel Headrick, was a timely read: circling back to the internship (which I really will discuss later), I’ve been dealing with issues of representing information about bird families in different formats, and this book discusses the histories of the information systems that were a big part of that process. But more on that later…

Summary: Headrick proposes that the current “Information Age” is only one of many historical information revolutions; in this book, he focuses on the information revolution of the Enlightenment. He outlines five categories of information systems: classifying/organizing, transforming, display, storage/retrieval, and communicating. His thesis is that developments in these information systems during this period, coupled with subsequent technological inventions, laid the groundwork for the Information Age. Demographic, cultural, political, and economic changes during this period created a build-up of information that could only be made sense of by inventing new information systems. So, for example, scientific nomenclature and classification systems were developed that suggested explanations for phenomena (e.g., the chemical classification system suggested possible new compounds). Statistics were used to transform the demographic, political, and economic information that was beginning to be collected. Visual information displays (maps, graphs, and thematic maps) were used to present large datasets efficiently and in an easily recalled manner. Cross-referenced dictionaries and encyclopedias succeeded at disseminating current, easy-to-access information to the general public (in contrast to the dense, thematically organized formats that came before). Postal and telegraphic systems (visual and electric) were devised to transmit messages; these went from private messenger services to restricted government systems to more open government systems.

Comments: This book gives a good overview of the development of information systems, though some chapters are more comprehensive than others. His emphasis on the information systems themselves, rather than the earlier technologies that facilitated them (e.g., the printing press) or subsequent technological innovations, was an interesting choice (though apparently he’s addressed later periods in other books). This would probably be a useful book for a History of T&T course. While heavy on names and dates, it covers a really interesting period of history (see my Blituri project, and I will add that the Baroque Cycle is a mostly excellent series set right before this period :).

Links to: Tufte (information representation); Benjamin (technologies of representation); Ong (social and technical aspects of changing representational practices)

Categories
exam readings hypertext research methods/philosophy

Exam reading: “Radiant textuality”

This is a summary of “Radiant Textuality,” by Jerome McGann. The book stems from the author’s work applying digital tools to the analysis of both literary and visual works. One of his major projects is the Rossetti Archive, which indexes the works of the author and painter Dante Gabriel Rossetti. Since my interests don’t lie in literary analysis of this sort, the things I pulled out of this book may not be McGann’s major points of emphasis.

Summary: In this book, McGann explores the application of digital tools to the critical analysis and interpretation of texts. He advocates a “quantum” model for textual analysis, which recognizes that textual interpretations are inherently variable. This method of analysis is performative, rather than traditionally interpretive/hermeneutic. He outlines two major methods for doing this, both of which can be aided by digital tools: 1) “deformations,” deliberately altering words or structure (or using filters, in the case of images) to reveal the underlying structural rules of the text; and 2) using what is essentially a role-playing game method (the “Ivanhoe Game”) to explore multiple possible constructions of the text. Rather than treating texts as vehicles for transmitting meaning, he argues, it’s more useful to consider them sets of algorithms that enable critical, introspective thinking about the text. Texts contain both graphical (design) and semantic signifying parts; the latter have been the subject of traditional interpretation, while he focuses on the former. These “invisible” design elements (organizational & linguistic) constitute a “textual rhetoric” or bibliographic code. Studying deformations is useful because it gets at what the text doesn’t do, which lets us see the underlying textual rhetoric.
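
Since “deformation” can sound abstract, here’s a toy sketch of what two very simple deformations might look like in code. This is entirely my own illustration (not anything from the book), and the sample lines are the opening of Rossetti’s “The Blessed Damozel,” quoted from memory:

```python
# Two simple "deformations": reading a poem's lines in reverse order,
# and isolating each line's final word.

poem = [
    "The blessed damozel leaned out",
    "From the gold bar of Heaven;",
    "Her eyes were deeper than the depth",
    "Of waters stilled at even;",
]

def reverse_deformation(lines):
    """Read the poem backward, line by line."""
    return list(reversed(lines))

def line_end_deformation(lines):
    """Keep only each line's final word, stripped of punctuation."""
    return [line.split()[-1].strip(";,.") for line in lines]

for line in reverse_deformation(poem):
    print(line)
print(line_end_deformation(poem))  # ['out', 'Heaven', 'depth', 'even']
```

The output obviously isn’t the poem anymore; the claim is that seeing a text systematically rearranged like this foregrounds structural choices (line-final rhymes, stanza symmetry) that ordinary reading passes over, which is the “textual rhetoric” McGann is after.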

Comments: McGann’s “radiant textuality” of the title refers to writing (and other types of text) in which the medium is the message and which promotes introspection, rather than writing whose purpose is information transmission. He states that his methods are most appropriate for creative/poetic texts rather than expository writing, because such texts deliberately lean toward the creative/design end of the spectrum (rather than the expository/“scientific” end). My interests lie on that other end of the spectrum, so I don’t really see myself putting his methods into practice very often. As an aside, the arrangement of this book could itself be described as a series of variations on a theme, exploring the same ideas in various ways across a set of essays.

Links to: O’Gorman (Blake; informational vs. design elements of texts), Hayles (how interaction between user & interface co-creates [Hayles] or explores [McGann] texts), Tufte (questions of form/design of texts), possibly Sullivan & Porter (quantum poetics rejects set methods of analysis or single interpretations)

Categories
exam readings visuals

Strategizing and mapping, part 2…

For my candidacy exams, I’m focusing on two subject areas: public understanding of science (PUoS) and informatics of community science (or, how communities use cyberinfrastructure for science projects).

Both of these areas contain a wide variety of concepts, and there are many connections between the two. I’ve created a concept map to help me organize how these ideas fit together. You should be able to click on it for a legible version.

Concept map for ideas connected during my dissertation research

What I’ve tried to do here is connect the concepts from various readings to some of the key ideas. The three topics on the left (in hexagons) come primarily from the PUoS literature: participatory genres of science learning, more traditional “communication” genres, and some concepts that are critical to address in order to have public understanding of science. These three represent important areas of focus in that literature.

Jumping to the right, the oval for PUoS is what I plan to do research on: I’m interested in how we can use online tools to improve PUoS. The general idea of this map is that we use the two diamonds (and connected ideas) in the center to get to that point. The topics in the diamonds come primarily from the informatics/community literature: the physical and social components of community science networks, and theories for understanding those networks. These topics either mediate or theorize specific approaches to PUoS, so we can get to an endpoint where we can evaluate PUoS. Hopefully this all makes sense.

A good concept map shouldn’t need this much explanation, but this is a working draft, mainly intended as a tool to help me pull a wide variety of readings together into some sort of coherent form. What I’d ideally like to do is connect these concepts with the readings themselves, which will add another layer to the diagram. At least, that’s the plan.

Categories
exam readings

Strategizing* and mapping, part 1…

One of my major projects at the moment is putting together reading lists for my dissertation candidacy exams. This means looking at two entire (sub)fields of research (as well as the core T&T field), trying to distill those fields down into a set of key publications, and then immersing myself in the ideas contained therein. It’s a big process, but I’m optimistic about the way it’s turning out.

I’m approaching this differently than the way I generated the reading list for my Master’s thesis, which was not really a focused approach. I remember spending a lot of time in the library (yes, back in the day when journal articles were actually kept in these paper things called “journals” that you had to physically go photocopy) looking up the most recent papers on stream algae, reading those papers for relevant information, and backtracking their references to find older papers. I suppose this is a pretty typical way to approach research for someone who doesn’t have a well-thought-out research strategy. It’s not that I didn’t know how to use databases for keyword searches, but I was definitely flailing while trying to grapple with the amount of information out there.

That experience with inefficiency taught me that a good research strategy is not something that necessarily drops out of the sky. There are a few key things I’ve pulled out of that experience that have been helpful at this point in my graduate career:

  • I have a better handle on the underlying subject areas than I did as a beginning Master’s student. That’s obvious, but it’s important, and it leads to the next few items. The key is being widely read before beginning focused research.
  • Being able to evaluate the scale and scope of a field helps establish its boundaries and the key concepts to focus on.
  • Being familiar with the major arguments and points of contention highlights the key concepts and threads to be aware of.
  • Since I plan to do an empirical study as part of my research, it’s been helpful to talk to people outside my institution who are doing on-the-ground research. They offer different perspectives, and it’s useful to see what the current trends in the field are.
  • …On the subject of current trends, being aware of what funding agencies are focusing on is probably a good idea, career-wise. (To clarify: choosing a dissertation topic to fit a specific funding program = bad; shaping research language to fit the topic into a broad area of interest = good? This is certainly a topic worth further exploration.)
  • Also, talking to people on my committee has obviously been important in shaping this process.
  • Evaluating the impact of specific papers. I’m not sure to what extent article impact factors are a sciences vs. humanities concept, but certainly the level of interest in the ideas brought forth in a publication indicates its centrality to a field (or at least to a specific debate in that field).
  • Keeping in mind that this is not all the reading I’ll be doing for my project. Leaving something out does not mean I won’t be using those ideas in the future.

This is pretty basic stuff and not all that original, but I thought I should try to be introspective about it, with the idea that at some point in the future I might be giving other people advice about this process. It looks like the key things are to read widely before beginning research, define the scope of the (sub)field, and get multiple perspectives. I’m sure I’m missing some important points here, though, and I’ll probably have different things to emphasize as I make my way through.

Well, this turned into a longer post than I thought it would! Next time, I’ll talk about my attempts to use concept maps to actually organize this information…

*”Strategizing”: not a real word, but fun to use…

Categories
exam readings networks tech design

Exam Reading: “Greening through IT”

This book, “Greening through IT” by Bill Tomlinson, is one of the newer ones added to the T&T core reading list, and addresses one area that I think the core list as a whole ignored previously: the broad-scale material basis of electronic technologies. While many of the theorists covered in the program emphasize the connections between mind and materiality (e.g., the physical experience of interacting with a computer is part of what makes reading online different from reading a book), no one thus far has addressed the broader ecological implications of these new technologies.

I would venture that most theorists approaching the T&T field from a critical theory perspective are (understandably) not aware of the ecological sustainability issues surrounding electronic tech: electricity use, e-waste, and planned obsolescence of devices, for example. Most authors focus on the social/philosophical implications of new technologies, and there’s a definite overall assumption that we will be able to continue to physically make and use these technologies in the future, without much consideration of natural resource limitations. Even the authors who focus on the “materiality” of technology concentrate on the individual user-machine interaction.

So there’s a need in the program for attention to these issues (which are a main concern of mine, given my background in ecology), and I think Tomlinson’s book does a decent job of addressing them. It’s not the perfect book on this topic for this program; I can see some of the more theory-centered students discounting it because of its low theory quotient (and its apparently annoying “evidence is used to support my theory, not contest it” tendency). However, it does provide a needed perspective, and I’m not sure what an alternative text that addresses these issues might be…

Summary: Tomlinson discusses potential uses of information & communication technologies (ICT) for environmental sustainability. He lays out a framework for “extended human-centered computing” (EHCC), which requires consciousness of scale (temporal, physical, complexity) when analyzing problems, guides system development in green directions, and compares technologies across a range of time/space/complexity scales to identify gaps not being addressed. There’s detail about various environmental problems, social barriers to change, and how we can use IT to address both of these large areas; it basically boils down to expanding our sense of, and ability to cope with, large time/space/complexity scales. He touches on three orders of technology effects that need to be considered: 1st (direct effects), 2nd (specific impacts on other economic sectors), and 3rd (general social or cross-industry effects). He breaks down his discussion of pathways for applying green IT ideas into industrial, educational, personal motivation, and collective action categories (with a lot of detail outlined for each). He presents several case studies in education, personal data tracking, and collective action, and discusses how each worked (or didn’t).
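
To make the scale-comparison part of EHCC concrete, here’s a toy sketch of what such a gap analysis might look like. The scale labels and the tools are invented placeholders of mine, and this is only my reading of the comparison idea, not code or a method from the book:

```python
# Toy sketch of an EHCC-style scale comparison: tag each (hypothetical)
# green-IT tool with the rough temporal and spatial scale it operates on,
# then list the scale combinations that no tool covers.

TIME_SCALES = ["hours", "years", "decades", "centuries"]
SPACE_SCALES = ["household", "city", "region", "planet"]

# Hypothetical example tools, placed on the two scales.
tools = {
    "home energy monitor": ("hours", "household"),
    "municipal recycling dashboard": ("years", "city"),
    "reforestation tracker": ("decades", "region"),
}

covered = set(tools.values())
gaps = [(t, s) for t in TIME_SCALES for s in SPACE_SCALES
        if (t, s) not in covered]
print(f"{len(gaps)} of {len(TIME_SCALES) * len(SPACE_SCALES)} scale "
      f"combinations are uncovered, e.g. {gaps[:3]}")
```

The interesting cells in such a grid are the big ones (decades/centuries, planet-scale): exactly the scales the book argues human cognition, and most of our software, handles poorly.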

Comments: This book is heavy on examples and perhaps short on theory. It would have been nice to see how the EHCC framework was used specifically to address gaps in scale (is there a heuristic for applying it in specific cases?). In the case studies, it seemed that the more controlled the environment, the better the technology worked for its designed purpose; e.g., the museum display worked better than the online programs that utilized crowdsourcing. I’d like to see more research on whether crowdsourcing/networking actually works for more than just getting Betty White on SNL (granted, this area of research is in its infancy). It’s also possible that the more advanced state of research on museum displays and childhood education in general is responsible for this effect. The question seems to be how to get adults to buy into some of these ideas. Some of the cited examples (the Indian fishermen) seemed more effective, though that could be because he was emphasizing the positives. The discussion of the theory of punctuated equilibrium was odd: given his emphasis on the importance of metaphors that work on more than just the surface (an idea I feel strongly about), this one bugged me.

Links to: Norman (technology design), Feenberg (applied case of design with technical and social ends in mind), Spinuzzi (objects & network formation?), Johnson (design for a purpose, though Tomlinson specifically focuses on the larger system rather than the user; the opposite direction from Johnson?)

Edited 8/28 to add links, correct Johnson’s name.