Categories
exam readings, research methods/philosophy

Exam reading: “Digital libraries”

Howard Besser’s take on “The Past, Present, and Future of Digital Libraries.” Online here.

Summary: Besser discusses both the history and functions of digital libraries. At a minimum, traditional libraries provide access to source material, contextualization, and commentary; digital tools add to this list the coordination of multiple archives and the facilitation of text analysis and searching. The traditional library provides four core components: a physical space, a mission to serve the underserved, a location for continuing education, and a guarantee of public access to collections. Within these components are several features that Besser characterizes as part of the traditional ethics of library practice: stewardship, stability, public service, information privacy, equal access, and providing a diversity of information. Digital libraries began as mere collections, but are now adding other traditional library services like curation. Besser believes that to be “true” libraries, digital libraries will need to incorporate these ethical standards into their missions as well (key digital issues are connectivity among collections, access/usability, and protecting privacy). While his main focus is on traditional library ethics, Besser also discusses the importance of standards for interfaces, best practices, user authentication, and metadata (which he divides into descriptive, discovery/search, structural/navigation, administrative, version identification, and longevity types).
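
To make the metadata typology more concrete, here is a minimal sketch (in Python) of how a single digitized item’s record might layer Besser’s six metadata types. The field names and values are my own hypothetical illustration, not a schema Besser proposes.

    # Hypothetical record for one digitized item, organized by Besser's six
    # metadata types. All field names and values are invented for illustration.
    record = {
        "descriptive": {"title": "Field notebook, 1912", "creator": "unknown"},
        "discovery": {"keywords": ["ornithology", "field notes"]},           # search/retrieval
        "structural": {"pages": 48, "sequence": ["cover", "p001", "p002"]},  # navigation
        "administrative": {"rights": "public domain", "scanned_by": "campus imaging lab"},
        "version": {"derived_from": "master_tiff_v1", "this_copy": "web_jpeg_v2"},
        "longevity": {"format": "TIFF 6.0", "fixity_check": "sha256 checksum"},  # preservation
    }

    # A public interface might expose only the descriptive and discovery layers:
    print(record["descriptive"]["title"], record["discovery"]["keywords"])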

Comments: Good summary of issues, and provides an ethical perspective on the movement to online archiving.

Links to: Cohen & Rosenberg (archiving & preservation issues); Jensen (publishing perspective); Burnard, et al. (TEI/standards)

Categories
exam readings, information representation, research methods/philosophy

Exam reading: “Electronic textual editing”

“Electronic Textual Editing” describes the main data archiving standards effort for the humanities. It’s not really a dynamic read- how thrilling can a collection of essays on XML and database construction really be? But it’s a useful overview of the TEI:

Summary: A collection of essays dealing with editing and archiving issues for electronic texts. It focuses on the Text Encoding Initiative (TEI), a project to create best practices and a markup scheme (first expressed in SGML, now in XML) for the humanities. The book can be broken into three main parts: general guidelines for creating and editing digital scholarly editions, case studies and lessons from editing both older and modern texts, and specific technical methods (e.g., digitizing documents, dealing with character encoding and markup). For scholarly editions, the important considerations are accuracy in documentation and thorough inclusion of textual variants. Digital editions/collections allow researchers to create quite accurate versions of a text (e.g., scanned copies), collect multiple versions of that document, and dynamically link them all. The functions of markup include labeling (and linking) sites of variability among texts, and replicating the structural/layout elements of originally print documents in their electronic versions. A few of the case studies had interesting points. The digitizer of the Canterbury Tales points to the importance of having explicit transcription principles before starting, and discusses how reading an electronic version of the text changes the editing & reading experience. For the creator of an electronic Thomas Edison archive, the major task seemed to be developing a good database to link text- & image-based documents. For poetry digitization, it was key to pay attention to both words and layout.
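
As a simplified illustration of the kind of variant markup these essays discuss, here is a sketch (in Python) that parses a tiny, invented apparatus entry built on TEI’s <app>/<lem>/<rdg> mechanism for recording witness readings. The line of text, the readings, and the witness sigla are made up; real project encodings are considerably richer.

    # Sketch of a TEI-style critical apparatus entry. The verse line, readings,
    # and witness sigla (#MS-A, #MS-B) are invented for illustration only.
    import xml.etree.ElementTree as ET

    fragment = """<l n="42">the <app>
        <lem wit="#MS-A">silver</lem>
        <rdg wit="#MS-B">siluer</rdg>
      </app> key</l>"""

    line = ET.fromstring(fragment)
    for elem in line.iter():
        if elem.tag in ("lem", "rdg"):
            # Each reading is labeled with the witness that carries it.
            print(elem.get("wit"), "->", elem.text.strip())

Encoding variants this way is what lets an electronic edition display any single witness’s text, or a collation of all of them, from one marked-up source.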

Comments: Glossed over the detailed technical essays, and focused on what I thought were the most salient points. Most authors were quite keen on XML for its formatting abilities; I’ve used some of its derivatives (XHTML & CSS). As I’m not involved with archiving or creating digital editions, this was more of an overview of this area of T&T.

Links to: McGann (TEI, digital archives); Headrick (classification systems in general)

Categories
exam readings, identity, information representation, visuals

Exam reading: “Simulacra and simulation”

Jean Baudrillard’s “Simulacra and Simulation” is popular among a certain set of postmodern enthusiasts, including the Wachowski brothers. I won’t go into how this book influenced The Matrix- you can go elsewhere for that.

Here’s my summary:

Summary: Baudrillard’s main concern is the cultural impact of mass/electronic media. Our culture of simulation has progressed to the point that simulation no longer refers to the real world- it is “hyperreal.” Reality has been replaced by nested systems of signs, all referring to one another; he calls this the “precession of simulacra.” Images began as reflections of reality, became masks for reality, then masks that mask the absence of reality, and are now not related to reality at all (only to other images). Examples include the Lascaux caves (we now experience only the simulation, Lascaux II) and Disneyland (an imaginary world set up to mask the fact that America is itself only a simulation, in which people take on roles but never truly interact). In some sense, electronic media make everything a simulation (e.g., political scandals mask broader truths about the capitalist system; nuclear deterrence means no one will ever have to use nukes, since we already know what would happen). He also discusses historical movies, which are more “real” than the reality was; history is no longer an active force- all cultures are congealing into one, and all that’s left is nostalgia. He takes “the medium is the message” to an extreme: e.g., the culture (content) in museums is merely a support for the medium to operate (the visitor experience)- the point is to have visitors, not to transmit the culture; also, advertising/propaganda are becoming the dominant features of mass media- publicity is all that matters (not ideas or meanings).

Comments: Briefly discusses how cloning & medical research are another expression of mass production (reproducible, without aura)- rather than taking a cyborg approach, he links this phenomenon to Benjamin’s ideas. I’m glossing over his discussion of education, where he says the only ways for non-conformists to not conform are either dropping out entirely or committing terrorism.

Links to: Benjamin (mass-produced society; body); McLuhan (medium)

Categories
exam readings, knowledge work, politics

Exam reading: “Social life of information”

Do ideas meet, flirt, and spawn off cute little baby ideas? Is Google a speed dating service between your computer and the object of your search? Is your credit card having an affair with that sexy Brazilian computer it met while you were on vacation?

Sadly, this book answers none of these questions. “The Social Life of Information,” by John Seely Brown & Paul Duguid, is about the perils of techno-cheerleading in the knowledge economy. “Social life” refers to the fact that information always has a strong social context; we are not really floating in a sea of decontextualized data.

Summary: Brown & Duguid lay out considerations that businesses & technology designers should keep in mind to avoid “tunnel design” (focusing on information while ignoring its social/material context). They believe that social interaction is crucial for businesses to function & for technologies to be used effectively. They take several myths of the information age to task: “endisms” (the end of politics, the press, etc.), the reframing of everything in information-processing terms (e.g., universities as information-transmitting centers), the claim that all businesses will be “flattened” & disaggregated, and so on. They address the trend toward decentralized & work-at-home offices, emphasizing that social interaction is needed and that we need more in our work environment than just a computer (a desk, tech support, post-its, etc.). They discuss the network structure of business: both stepwise processes and the lateral links that let workers share practices are important; links within companies (e.g., between specializations) and between companies (in professional/disciplinary networks) are also crucial. They distinguish knowledge (contextual, requires a knower, needs assimilation for meaning) from information, and identify two dimensions of knowledge: explicit (knowing that) and tacit (knowing how). A big part of their discussion is communities of practice and how members learn through day-to-day interactions with more experienced members; distributed businesses had better have really good communication networks in order to facilitate even an approximation of this type of interaction.

Comments: Skipping discussion of bots & what they can & can’t do (technical capabilities, legal/ethical issues), the future of paper (yep, we’ll still need it). While their discussion of how universities will change (distributed systems with a mix of online components and physical centers) mentions that physical interactions in traditional universities are important, I think they downplay the importance of these interactions. For example, you can’t get a thorough education in a technical field or science without lab or field work; I don’t think their suggestion of internships or brief stints at research centers would work here.

Links to: Liu (knowledge work); Spinuzzi (info networks); Johnson-Eilola (knowledge work envt.)

Categories
exam readings, identity, knowledge work, transparency

Exam reading: “Essential McLuhan”

This book includes selected works by Marshall McLuhan, a popular figure in cultural criticism:

Summary: McLuhan’s main thesis is that the media by which we communicate are powerful shapers of psychology and culture. Media are our ways of extending the human sense organs into the environment. When new media technologies are introduced, the balance among the senses people rely on shifts (e.g., writing started to emphasize vision, and eventually print enabled logic, 3-D perception, and the individual ego). There’s a fundamental difference between vision (which acts to separate people from their environment) and all the other senses (which immerse people in their environment). Non-literate cultures (he includes those using non-alphabetic writing in this group) are primarily auditory and tribal, while alphabet-using cultures are visual and civilized. Electronic media are in the process of making the entire world auditory and tribal again; these media affect feeling, not thought. In terms of influencing society, the medium matters more than the message. Even visual media are changing- the juxtaposition of multiple visual elements creates a symbolic landscape, in contrast to a single linear chain of argument & evidence. Holistic/systems thinking is the new paradigm; we will no longer need specialists, because generalists immersed in the new sensory paradigm will be able to figure everything out.

Comments: McLuhan’s formulation of the relationship between media use & culture is strongly deterministic. He distinguishes between “hot” (aural, “hyperesthetic,” demanding low participation from the audience) and “cool” (visual, detached, demanding high audience participation) media, but contradicts himself about which technologies are which and where writing fits in- I don’t find this formulation convincing (I’m sticking with the vision/other senses distinction, which at least he’s consistent about). He uses some questionable (from a sociological perspective) interpretations of examples from Africa and China to support his ideas about alphabetic literacy. McLuhan’s writing style and futuristic bent are horoscope-like: it’s easy to pick out predictions that seem to have come true while ignoring those that have not.

Links to: Feenberg (technological determinism); Ong (more scholarly analysis of media & representation); Brown & Duguid (knowledge work cheerleader)

Categories
exam readings, politics, tech design

Exam reading: “Questioning technology”

Moving on to the next book, so no extra commentary… Andrew Feenberg’s “Questioning Technology”:

Summary: Feenberg proposes a middle ground between technological determinism and the belief that technology is a neutral force: the idea that technology does influence society, but that society can also influence technological development. Technological development can either reinforce or be used to change existing power structures; design is itself a political act, because the choice between design alternatives takes place against an implicit background of social norms and codes. To counter tendencies toward technocracy, Feenberg proposes a “micropolitics of technology”: localized citizen or user involvement in decisions about development choices. He calls this “democratic rationalization.” Key elements are communication channels (between user networks), dialogue between experts and the public, and attention to costs/tradeoffs that industry has externalized. Finally, he outlines two poles or trends for technology: concretization (elegant design, combining multiple functions in one part) and differentiation (local adaptation of flexible(?) technologies into social systems). Concretization tends to occur in objects undergoing “primary instrumentalization” (decontextualization, reductionism, autonomization, positioning); differentiation occurs with “secondary instrumentalization” (systematization, mediation, vocation, initiative).

Comments: Politics: he differentiates between “thin” (personal freedoms, mass-media driven) and “strong/deep” (emphasis on local collective action) democracy. Options for increasing public participation include town hall meetings (of limited use), influencing professional societies, and involvement in planning in areas with loose government control (utilities, hospitals, land use). Philosophy: I’m skipping over the philosophical basis of his model (big question: does controlling objects violate their integrity & make them “less”?).

Links to: Johnson (user involvement with tech.); Norman (tech. design); Liu (politics of tech.); Tomlinson (envt. & development)

Categories
exam readings, hypertext, identity, transparency, visuals

Exam reading: “Writing space”

I’ve probably read at least parts of “Writing Space,” by Jay David Bolter, in three different courses so far. It’s clearly been an influential book in the T&T field (though of course some authors love it, while others use it as something to argue against):

Summary: Bolter explores the ways in which digital media are changing traditional “writing spaces”: the material & virtual fields of writing that are determined both by a technology and by the ways it’s used. One important way this happens is through remediation: a new medium taking the place of an older one while borrowing its conventions. For Bolter, one reason new media are adopted is that they bring a greater sense of immediacy, derived either from increased transparency of the medium (“looking through”) or from increased hypermediacy (awareness of the medium; “looking at”). Bolter focuses on the ways that the Internet, and hypertext in particular, remediates older technologies (e.g., linking as a rhetorical tool that allows associational, non-linear expression; lack of closure; increased participation from the reader). One key feature is the use of visuals in online writing that are not constrained by the text; visuals may replace text or serve as visual puns, and text may try to become as vivid as visuals (ekphrasis). If writing is a metaphor for thought (and writing systems for our sense of self), then “multilinear” hypertext may be closer to how the associational mind works and may reflect our postmodern identity. Writing spatializes time (i.e., speech); going from print to hypertext is in some ways like returning to the conversational mode of oral dialogue.

Comments: Bolter suggests that the increased use of visuals is an attempt to get rid of arbitrary symbol systems (i.e., the alphabet) and return to picture writing. However, modern picture writing differs from preliterate picture writing in that more abstraction can be expressed (e.g., icons). He also discusses semiosis (the movement from one sign to another via reference); to read is to interpret semiotic meaning in the difference between signs (e.g., intertextuality, linking).

Links to: Hayles (hypertext literature); Ong (writing systems and thought)

Categories
exam readings, identity, knowledge work, politics, research methods/philosophy

Exam reading: “Simians, cyborgs, and women”

This book, by Donna Haraway, has some very influential ideas about identity and politics in an increasingly technologically-mediated world. Unfortunately, there’s a lot in here that I really can’t agree with- namely, her attack on science from a feminist/Marxist perspective. While I agree with her thesis that science often has been used to justify oppression of various sorts, my perspective is that this is a misappropriation of science for political purposes, rather than an unavoidable outcome of objective rationality.

I’m not arguing that scientists are pure, with no hidden biases and motivations for their research. Everyone has biases, but it seems that most scientists, when confronted with evidence of their biases, are willing to rethink their views. Are there systemic barriers to such change? In some cases, yes. But I feel that these are things that can be attacked without effectively throwing away our best system of tools for proving that bias exists, and that it’s inappropriate.

Summary: Three main sections: 1) an exploration of the oppressive nature of objective science; 2) an exploration of the impossibility of describing a single “women’s” or “women of color’s” experience; and 3) a description of an emerging cyborg identity in which nature, culture, and technology intertwine to shape us. Subject/object distancing in science is implicated in oppression and patriarchal dominance politics (primate research & human health research in particular are used to perpetuate repressive ideologies); what we need is a new, situated objectivity that recognizes the limitations of our partial perspective and regards objects of knowledge as “material-semiotic actors” (constantly generating their own meanings). The cyborg concept can be seen either as the ultimate domination of nature by technology or as the fusion of nature, the human, and technology. Biological metaphors become cultural metaphors; for example, the postmodern view that there is no unitary identity has parallels in biology (the different cell lines of the immune system, women sometimes treated as containers for a fetus). She describes the information society as an “informatics of domination”: workers are becoming feminized- low job security, replaceability, a shredded social safety net, cultural impoverishment.

Comments: After reading this book, and a few other papers on the subject, I’m still unsure what “feminist science” would entail. I see a possible continuum in Haraway’s book, ranging from using standard scientific methods to investigate consistent bias within a field (e.g., asking questions about female kinship patterns in apes, rather than the traditional focus on male aggression), to a separate set of standards of evidence (and a new epistemology) for feminist science vs. mainstream science (e.g., admitting folk medicine as science because it’s a deeply-held belief), to the idea that all science is just rhetoric used to construct social reality. While Haraway explicitly rejects that third view, she is vague about the specifics of what she wants to see. She does provide specific examples of the first view, so maybe that type of criticism is sufficient for her; but she also places a lot of weight on redefining objectivity, which would seem to indicate that she wants a new epistemology. I absolutely agree with the first view, and absolutely disagree with the latter two.

Links to: Liu (politics of the info economy); Hayles (top-down vs. emergent systems theory- Haraway’s book is older, so perhaps she addresses this in later work?)

Categories
exam readings, geekery, hypertext, identity

Exam reading: “Hamlet on the holodeck”

Janet Murray’s “Hamlet on the Holodeck” is a 1997 book that tries to reconcile “good” storytelling with not-fully-realized new media. Yes, there are several Star Trek references. Unfortunately, most are to Voyager…

Summary: Murray explores how narrative may change in stories based in new interactive media. For her, the key to avoiding fears of VR addiction and culturally depauperate stories is to concentrate on meaningful storytelling. She begins by describing storytelling in new media genres (MUDs, 3-D movies, simulators, etc.), the boundaries of which will eventually blur. There are four characteristics of digital environments that make them new: procedural construction, participation, spatial dimension, and encyclopedic scope (the first two = interactivity). Because of these characteristics, new media environments can satisfy the desire for immersion in virtual worlds, give audiences agency (the ability to take meaningful action), and offer a mutable environment that allows transformation of traditional storylines. She also outlines several possible “cyberdrama” formats, some of which are now in use: “hyperserials” (TV shows with an added online dimension), “mobile perspective” programs, and virtual worlds for roleplaying. Meaningful storytelling in new media should seem true to the human condition. It could use stock formulas or characters in new ways- her example is bardic performance, which varies stock elements to create new compositions. Or it could explore the possibilities of telling stories with an expanded scope (a systems perspective), or simply explore world-building possibilities.

Comments: Since my focus is not on the narrative properties of new media, I’m skipping a lot of detail in that area (e.g., ways to create plot in a non-linear setting, game goals vs. plot-driven goals, ways to create responsive & believable virtual characters using AI). Provides some good links between more traditional ways to construct stories and ways to use new technologies. World-building ideas make me think of MMORPGs.

Links to: Hayles (e-lit & narrative); Turkle (psychology of interactive environments & AI characters); Manovich (components of interactivity)

I’ll add this analysis of the Holodeck as a narrative device (rather than Turkle’s Holodeck-as-technology).

Categories
exam readings, knowledge work, pedagogy, tech design

Exam reading: “Datacloud”

I often wonder about positive interpretations of the new, “postmodern” information-dense and chaotic work environment. For example, how well will this exciting new model of info-surfing hold up, given recent evidence that we really can’t multitask? There are also significant issues skipped over in most discussions of the changing work environment: the wide divergence of incomes between certain classes of knowledge workers and non-knowledge workers, the digital divide, and class stratification.

I certainly don’t know how these things are going to play out. But here’s another exam reading that doesn’t really address them head-on: Johndan Johnson-Eilola’s “Datacloud.” It’s probably a scope issue- he does at least mention these issues, but his focus is clearly elsewhere:

Summary: In this book, Johnson-Eilola tries to describe the changes in the work environment occurring in information-based jobs, and how both education and computer workspaces should be changed to facilitate this new way of working. He describes the standard model of how the “symbolic-analytic” (S-A) workplace is becoming the new postmodern paradigm: fragmented, mobile, computerized, contingent, built around situation-specific solutions and under-defined goals, playful, and facilitating a concept of the self that is fluid and changeable. He focuses on how different articulations (“suggestions about acceptable meanings”) of technology can be sites of resistance to dominant cultural trajectories. “Articulation theory” is a postmodern adaptation of Marxism which holds that subjects are constructed within social/class contexts, but that these sites of negotiation allow the subject some agency. His main focus is on how workspace design (mostly the computer interface, though the physical space is also important) can be changed to facilitate S-A work. S-A work requires the ability to navigate between complex spatial representations of data, to communicate with other workers as needed, and to display some information in different, more permanent locations (e.g., whiteboards). Education spaces need to change to get students comfortable with these immersive work environments. Students also must learn how to be creative about representing and using information (rather than just accepting PowerPoint or Excel defaults).

Comments: Mentions the ideological nature of articulations (e.g., current word processors make it hard to work in a dynamically interlinked environment because of clunky embedding), but doesn’t go into much detail about group-level politics. Characterizes hyperspace as linear on a temporal scale, rather than as a fluid network that gives up temporality. Blogs are an example of newly emergent symbolic-analytic spaces (the book was published in 2005): dynamic production sites with RSS feeds that let readers experience them in different temporal & spatial sequences.
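
Since the RSS point is the most concrete one here, a small sketch (in Python) of what re-sequencing a feed might look like in practice. It relies on the third-party feedparser library and a placeholder feed URL; neither comes from Johnson-Eilola, who discusses the idea rather than any implementation.

    # Sketch: re-ordering a blog's RSS entries so a reader can impose a different
    # temporal sequence than the usual newest-first presentation.
    # Requires the third-party "feedparser" package; the URL is a placeholder.
    import feedparser

    feed = feedparser.parse("https://example.com/feed.rss")

    # Keep only entries that carry a parsed date, then read the blog oldest-first.
    dated = [e for e in feed.entries if "published_parsed" in e]
    for entry in sorted(dated, key=lambda e: e["published_parsed"]):
        print(entry.get("published", "?"), "-", entry.get("title", "untitled"))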

Links to: Liu (historicizes symbolic-analytic work); Brown & Duguid (less focus on specifics of work envt.); Spinuzzi (approaches topic from network theory); Bolter (hypertext)