Categories
exam readings information representation research methods/philosophy

Exam reading: “Electronic textual editing”

“Electronic Textual Editing” describes the main data archiving standards effort for the humanities. It’s not really a dynamic read- how thrilling can a collection of essays on XML and database construction really be? But it’s a useful overview of the TEI:

Summary: A collection of essays dealing with editing and archiving issues for electronic texts. Focuses on the Text Encoding Initiative (TEI): a project to create best practices and markup languages (first SGML, then XML) for the humanities. It can be broken into three main parts: general guidelines for creating and digitally editing scholarly editions; case studies and lessons for editing both older and modern texts; and specific technical methods (e.g., digitizing documents, dealing with character encoding and markup). For scholarly editions, important considerations are accuracy in documentation and thorough inclusion of text variants. Digital editions/collections allow researchers to create quite accurate versions of a text (e.g., scanned copies), collect multiple versions of the same document, and dynamically link them all. The functions of markup language include labeling (and linking) sites of variability among texts, and replicating structural/layout elements in electronic versions of originally print documents. A few of the case studies had some interesting points. The digitizer of the Canterbury Tales points to the importance of having explicit principles for transcription before starting, and discusses how reading an electronic version of the text changes the editing & reading experience. For the creator of an electronic Thomas Edison archive, the major task seemed to be developing a good database to link text- & image-based documents. For poetry digitization, it was key to pay attention to both words and layout.
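To make the variant-labeling function concrete, here is a hypothetical TEI fragment (my own illustration, not an example from the book) encoding a single variant reading of a Canterbury Tales line across two witnesses; the witness IDs `msA` and `msB` are invented:

```xml
<!-- Hypothetical TEI critical-apparatus entry: one variant reading
     recorded across two manuscript witnesses (witness IDs invented) -->
<app>
  <lem wit="#msA">Whan that Aprill with his shoures soote</lem>
  <rdg wit="#msB">Whan that Aueryll with his shoures soote</rdg>
</app>
```

The `<app>` (apparatus) element groups the editorially preferred reading (`<lem>`) with alternate readings (`<rdg>`), each tied to its source manuscript via the `wit` attribute- which is what lets a digital edition link and compare versions dynamically.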

Comments: Glossed over the detailed technical essays, and focused on what I thought were the most salient points. Most authors were quite keen on XML for its formatting abilities, which I’ve used derivatives of (XHTML & CSS). As I’m not involved with archiving or creating digital editions, this was more of an overview of this area of T&T.

Links to: McGann (TEI, digital archives); Headrick (classification systems in general)

Categories
identity research methods/philosophy science studies

More on science, culture, and feminism

In my post yesterday about Donna Haraway’s book, “Simians, cyborgs, and women,” I talked about how it called for a rethinking of how primate research and human culture shape one another. More importantly, I argued that science doesn’t have to be anti-feminist just because it’s science. Here’s a timely example:

To illustrate how powerful the influence of culture can be for primate societies consider the most extreme example of a sexually coercive species: savanna baboons. Males have been known to viciously maul a female that has rejected their advances and the level of male aggression is strongly correlated with their mating success. However, in a unique natural experiment Stanford primatologist Robert Sapolsky observed what developed when the largest and most aggressive males died out in a group known as Forest Troop (because they were feeding at the contaminated dump site of a Western safari lodge). In the intervening years Forest Group developed a culture in which kindness was rewarded more than aggression and adolescent males who migrated into the troop adopted this culture themselves.

Read the rest of this post- it’s a great example of how scientific tools and methods are not necessarily tied to maintaining traditional, oppressive social frameworks, as suggested by Haraway.

Categories
exam readings identity knowledge work politics research methods/philosophy

Exam reading: “Simians, cyborgs, and women”

This book, by Donna Haraway, has some very influential ideas about identity and politics in an increasingly technologically-mediated world. Unfortunately, there’s a lot in here that I really can’t agree with- namely, her attack on science from a feminist/Marxist perspective. While I agree with her thesis that science often has been used to justify oppression of various sorts, my perspective is that this is a misappropriation of science for political purposes, rather than an unavoidable outcome of objective rationality.

I’m not arguing that scientists are pure, with no hidden biases and motivations for their research. Everyone has biases, but it seems that most scientists, when confronted with evidence of their biases, are willing to rethink their views. Are there systemic barriers to such change? In some cases, yes. But I feel that these are things that can be attacked without effectively throwing away our best system of tools for proving that bias exists, and that it’s inappropriate.

Summary: Three main sections: 1) exploration of the oppressive nature of objective science; 2) exploration of the impossibility of describing a single “women’s” (or “women of color’s”) experience; and 3) description of an emerging cyborg identity in which nature, culture, and technology intertwine to shape us. Subject/object distancing in science is implicated in oppression and patriarchal dominance politics (primate & human health research in particular are used to perpetuate repressive ideologies); what we need is a new situated objectivity that recognizes the limitations of our partial perspective and regards objects of knowledge as “material-semiotic actors” (constantly generating their own meanings). The cyborg concept can be seen either as the ultimate domination of nature by technology or as the fusion of nature, the human, and technology. Biological metaphors become cultural metaphors; for example, the postmodern view of no unitary identity has parallels in biology (different cell lines in the immune system, women sometimes viewed as fetus containers). She describes the information society as an “informatics of domination”: workers are becoming feminized- low job security, replaceability, a shredded social safety net, cultural impoverishment.

Comments: After reading this book, and a few other papers on the subject, I’m still unsure what “feminist science” would entail. I see a possible continuum in Haraway’s book, ranging from using standard scientific methods to investigate consistent bias within a field (e.g., asking questions about female kinship patterns in apes, rather than the traditional focus on male aggression), to a separate set of standards of evidence (and a new epistemology) for feminist science vs. mainstream science (e.g., admission of folk medicine as science because it’s a deeply-held belief), to the idea that all science is just rhetoric, used to construct social reality. Haraway explicitly rejects that third view, but she is vague about the specifics of what she wants to see. She does provide specific examples of the first view, so maybe that type of criticism is sufficient for her; but she also places a lot of weight on redefining objectivity, which would seem to indicate that she wants a new epistemology. I absolutely agree with the first view, and absolutely disagree with the latter two.

Links to: Liu (politics of info economy); Hayles (top-down vs. emergent systems theory-H. book is older, so perhaps she addresses this in later work?)

Categories
exam readings research methods/philosophy rhetoric tech design

Exam reading: “User-centered technology”

In this book, Robert Johnson explores technical design from a rhetorical perspective. His “system-oriented” and “user-oriented” distinction brings to mind this cartoon.

Summary: Johnson explores the relationship between technology and people from a technical communication perspective. He articulates two types of knowledge: expert theoretical knowledge and applied practical knowledge; traditionally, theoretical knowledge is valued more highly. End-users are often invisible in the design process, which can lead to problem-prone technologies. Johnson advocates getting users involved from the start of the design process, and incorporating their practical, task-based knowledge into design. He contrasts this approach (user-centered design) with system-centered and user-friendly design processes. Johnson grounds his book in a “user-centered rhetorical complex of technology”: a reworked version of the rhetorical triangle that places the user at the center; has the designer, the system, and the user tasks as the vertices; and places this relationship within concentric circles of general activities (learning, doing, producing), constraints of human networks (institutions, disciplines, community), and finally large social factors (culture, history). He emphasizes the importance of reflecting on assumptions about technological determinism when designing large projects. He also discusses the specific case of producing technical explanations for computer systems (e.g., documents should be organized in a task-oriented way). He ends by connecting the aims of technical writing pedagogy to those of rhetoric (focus on the user/audience, supposed to be working toward “the good”), and suggesting a service-learning approach for tech writing classes.

Comments: Johnson falls somewhere in between Feenberg and Norman on an axis of “politics and philosophy” vs. “design for easy use.” For non-technical writers, there are still some good design ideas here (though he does emphasize the application of his ideas to this field), mainly having user input throughout the design process, designing with specific tasks in mind, and avoiding a “design for dummies” approach.

Links to: Norman (user-friendly approach); Feenberg (phil./politics of technology); Gee (learning by doing)

Categories
exam readings hypertext research methods/philosophy

Exam reading: “Radiant textuality”

This is a summary of “Radiant Textuality,” by Jerome McGann. The book stems from the author’s work applying digital tools to the analysis of both literary and visual works. One of his major projects is the Rossetti Archive, which indexes the works of Dante Gabriel Rossetti, a poet and painter. Since my interests don’t lie in literary analysis of this sort, the things I pulled out of this book may not be McGann’s major points of emphasis.

Summary: In this book, McGann explores the applications of digital tools to critical analysis and interpretation of texts. He advocates a “quantum” model for textual analysis, which recognizes that textual interpretations are inherently variable. This method of analysis is performative, rather than traditionally interpretive/hermeneutic. He outlines two major methods for doing this, both of which can be aided by digital tools: 1) “deformations”- deliberately altering words or structure (or using filters, in the case of images) to let you make out the underlying structural rules of the text; and 2) using essentially a role-playing game method (the “Ivanhoe Game”) to explore multiple possible constructions of the text. Rather than treating texts as vehicles for transmitting meaning, he argues, it’s more important to consider them as sets of algorithms that enable critical, introspective thinking about the text. Texts contain both graphical (design) and semantic signifying parts; it’s the latter that have been the subject of traditional interpretation, while he focuses on the former. These “invisible” design elements (organizational & linguistic) constitute a “textual rhetoric” or bibliographic code. Studying “deformations” is useful because it gets at what the text doesn’t do; this lets us see the underlying textual rhetoric.

Comments: McGann’s “radiant textuality” of the title refers to writing (and other types of text) in which the medium is the message and promotes introspection, rather than writing whose purpose is information transmission. He states that his methods are most appropriate for creative/poetic texts, rather than expository writing, because these texts deliberately lean towards the creative/design end of the spectrum (rather than the expository/“scientific” end). My interests lie on the other end of the spectrum, so I don’t really see myself putting his methods into practice that often. As an aside, the arrangement of this book could itself be read as a series of variations on a theme, exploring the same ideas in various ways across a set of essays.

Links to: O’Gorman (Blake; informational vs. design elements of texts), Hayles (how interaction bet. user & interface co-creates [Hayles] or explores [McGann] texts), Tufte (questions form/design of texts), possibly Sullivan & Porter (quantum poetics rejects set methods of analysis or single interpretations)

Categories
exam readings pedagogy research methods/philosophy transparency visuals

Exam reading: “E-crit”

This post is a summary of E-Crit: Digital media, critical theory, and the humanities by Marcel O’Gorman. I’ve read this book before and used some of the concepts in a paper- I thought I would read something that was a bit of a review after the last book I read… After reading Opening Spaces, it was interesting to see how this book really focuses on postmodern methods without taking ethical considerations into account (though political considerations are part of it). The intersection of these two texts makes me think of a series of blog posts on iblamethepatriarchy.com that look at the intersection of feminist criticism and postmodern evaluations of art (pretty thought-provoking). Anyway, one of the comments on a post there said that exposure to feminist interpretation ruins all art, because you can no longer look at art without thinking about the material and social conditions under which that art was made. (Not entirely sure what the connection is here, but O’Gorman does a lot of postmodern art analysis as part of his argument.) So if you’re an art lover, maybe better not to follow that link…

Summary: O’Gorman is trying to lay out a shift in academic methods that will revitalize humanities work by taking advantage of possibilities inherent in digital media. For him, academic disciplines are fragmented, hierarchical, and print-centered, which leads to interpretation (hermeneutics) and repetition rather than creativity (heuretics). He foregrounds three types of “remainder”/“others” of academic discourse: puns/nonlinear transitions, digital media, and imagery. He introduces “hypereconomy”- the use of “hypericons” to connect a network of discourses and lead to intuitive exploratory linkages between them. One big emphasis is on picture theory: images are subjective (non-transparent) and in a struggle with text (think LOLcats- text and image can be contradictory and create new meanings). He contrasts the educational strategies of Ramus (classifying & compartmentalizing knowledge without reference to random mnemonic devices) to the work of Wm. Blake (image/text contradictions, opposition to creating conformist students). He calls hypereconomy a “technoromantic” method of expression- using subjective, affect-based interpretations of print and images to create a bricolage of sorts. These constructs incorporate four primary images: personal, historical, disciplinary, and pop-culture (he adds in a written interpretive component when assigning them in his classes). Part of what they do is promote shifts in the figure/ground relationships in images (via subjective interpretations, “nonsense” connections, and hyperlinking). O’Gorman speculates that constant exposure to visual stimuli is leading to increased abstract & spatial reasoning. He concludes by laying out a plan to rejuvenate humanities departments by incorporating digital media studies and criticism: this would add technological “rigor” but still let departments teach criticism of the changing social/technological environment.

Comments: O’Gorman’s main focus seems to be the hypereconomy method as a tool for invention, and the call to incorporate digital media into humanities departments as a way to subvert “technobureaucratic” management of universities seems a bit tacked-on. Some of the visual theory he builds his argument upon (e.g., Gombrich’s “mental set” of interpretations) isn’t empirically supported (as I recall). His concepts about non-transparent visuals & language have been the most useful things for me. I probably fall into the traditional linear-enlightenment camp & am not convinced that hypereconomy projects can actually lead to useful critiques of institutions (a bit too materialist, I guess). When I first read this book, I had a much stronger reaction to the anti-Enlightenment thread that runs through it- I’m either becoming inured to such a position or starting to reconcile the cognitive dissonances from my previous training…

Links to: Bolter (remediation, transparency)

Categories
exam readings research methods/philosophy

Exam reading: “Opening spaces”

So, one of the things I want to use this blog for is as a place to post summaries of, and thoughts on, my candidacy exam readings. We’ll see how consistent I end up being with this as I go along.

My first post is on Opening Spaces: Writing Technologies and Critical Research Practices by Patricia Sullivan and James Porter (1997). This book outlines the authors’ critical research philosophy. I had a difficult time getting through this book- part of that reaction stems from my own research orientation/background, which is very different from that of the authors. I felt that they oversimplified the traditional research process, which they characterize as picking a preset set of methods, applying them, then writing about the results (though I’m probably oversimplifying here). Granted, most research reports imply that this is how research works, glossing over changes in method, vagaries of the specific research situation, etc. I have to keep in mind that I’m coming from a very different theoretical background here.

Summary: The authors advocate critical research practices, in general and specifically in the context of technology and composition.  Their philosophy is that methodology should be heuristic and context-dependent, knowledge generated should be situated, and that praxis (critical practice) is key to generating knowledge.  Their emphasis is on practice, rather than ideology or methods (the other two elements of research).  Key concepts are that researchers should be alert for bias, conscious of situated and customized practice, willing to critically change methodology during research, aim to liberate study participants, focus on user-technology interactions (rather than technology only), consciously involve the researcher in the research, and highlight the implications of methods in writeup.  There is a distinction between fixed methods and malleable methodology.  Their rhetorical/political goals include: respecting difference, caring for others, promoting access to rhetorical procedures enabling justice, and liberating the oppressed through participant empowerment.  They offer several ways for enacting their ideas, all of which center on exploring tensions in the research process: disciplinary tensions (use methodological framework mapping; list multiple binaries); environmental tensions (mapping the research scene-location, technologies, events, relationships, data collection capabilities; metaphor analysis-participants’ vs. researchers’); between ideal methods and realizable positions (assumption analysis); and researcher-participant tensions (competing narratives; advocacy charting).

Comments: The authors strongly advocate for empowering participants, but object to “objective” measurements of study success. How, then, can we determine whether there has actually been a material improvement in participants’ situations? For them, it seems to be enough merely to give participants tools for liberation. As a research example, they cite an imperfect study in which examples illustrated the theory presented, but did not “challenge” or “complicate” it. This is counter to my previous understanding of the roles of theoretical framework and empirical examples in research (different field, different expectations).

Links to: Feenberg (instrumental/substantive views of tech), Johnson (User-centered design), Bolter & Landow (comment that their research implies technological determinism/substantive view of media)

Edited 8/28 to correct Johnson’s name.