Categories: exam readings, visuals

Exam reading: “The Work of Art…”

As fascism was creeping through Europe, Walter Benjamin wrote “The Work of Art in the Age of Its Technological Reproducibility.” In it, he lays out a politically driven, optimistic view of the effects of new representational technologies on society’s relationship to art, while also warning of fascist forces trying to subvert this process and marry art to politics. A German Jew, Benjamin committed suicide in 1940 when he was detained in fascist Spain while fleeing occupied France for the U.S. A depressing ending, considering some of the optimistic(?) ideas in this piece.

Summary: Benjamin discusses the ways in which, he contends, the ready availability of photographs and film has changed the relationship between spectators and artistic works. For example, the original context of artistic works was that they were one of a kind (“authentic”) and viewed in ritualized ways (e.g., as religious icons, or contemplated during a museum visit). Mechanical reproducibility lets one take these works out of context and view them anywhere, particularly in distracted settings. He contends that this demystification of art is positive; art now becomes political (i.e., can be used for political ends in the education of the masses), not cultish. (One place in which cult value hangs on is portraiture.) With film, spectators are now “quasi-experts,” because they can also be on camera themselves; there is less reverence for the actors. The relationship of the masses to art is changed: critical appraisal and simple enjoyment are unified in works with high social impact (in works with low social impact, the converse is true). We now internalize art while in a state of distraction (e.g., in a movie theater), rather than placing ourselves into the artwork as in traditional art appreciation; he likens this state of distraction to that of experiencing architecture while going about day-to-day activities.

Comments: As a Marxist, Benjamin frequently links the demystification of art with the advance of the proletariat, and the attempt to re-mystify art (and aestheticize politics) with fascism. I’m not sure how well his predictions have borne out regarding the easy accessibility of art raising political consciousness in the masses (I think that advertising research, for example, suggests something different). On another note, there are links here to some authors on scientific visuals. For example, Benjamin states that, with photography and cinema, the artistic and scientific (informational) content of art is identical; I think the situation is more complex than that: there are multiple levels of meaning that can be experienced in such representations.

Links to: Ong (representational practices); Headrick (technologies of representation)

Categories: exam readings, information representation

Exam reading: “Orality and literacy”

I felt like I needed to do a re-read today and take a little bit of a break. “Orality and Literacy,” by Walter Ong, is about the ways that writing technologies have affected human ideas and expression. It’s a book I’ve read in two classes thus far.

One of the things I had fun with previously with this book was an assignment to represent the different developments discussed in it in timeline format. Since Ong primarily focuses on European history (though he does provide examples from other cultures), I wanted to include developments from other regions and situate the whole project within the entire timeline of human history. Part of my reason for this is my background with Hawaiian culture, a culture in which the transition from orality to literacy has happened fairly recently. It required a lot of thought, some big changes in scale, and several hours in Illustrator, but I’m pretty happy with it.

Here’s a preview of the timeline:

Ong timeline preview

Here’s the full pdf. (Dates are approximate.)
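(A side thought: if I ever redo this programmatically instead of in Illustrator, the scale problem could be handled with a logarithmic time axis. Here’s a minimal, hypothetical sketch in Python/matplotlib; the milestones and dates below are my own rough placeholders, not pulled from the finished timeline.)

    import matplotlib.pyplot as plt

    # Rough placeholder milestones, in years before present (approximate).
    events = {
        "earliest writing (Sumer)": 5200,
        "Greek alphabet": 2800,
        "Gutenberg's press": 570,
        "telegraph": 180,
        "World Wide Web": 20,
    }

    fig, ax = plt.subplots(figsize=(8, 2))
    for label, ybp in events.items():
        ax.scatter(ybp, 0, zorder=3)
        ax.annotate(label, (ybp, 0), rotation=45, ha="left",
                    textcoords="offset points", xytext=(0, 10))
    ax.set_xscale("log")   # a log axis keeps deep prehistory and recent
    ax.invert_xaxis()      # centuries legible on the same line
    ax.set_xlabel("years before present (log scale)")
    ax.get_yaxis().set_visible(False)
    plt.tight_layout()
    plt.show()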

And here’s my summary of the book:

Summary: Taking oral culture as a baseline, Ong explores the impacts that writing technologies (first script, then print, and now electronic technologies) have had on human expression, patterns of thought, and society at large. For example, Ong states that oral cultures were largely communal, and that storytelling relied on creative and situation-dependent groupings of stock formulas and characters. Learning was rooted in apprenticeship and daily practice, and naming gave people power over objects. The technological shift of writing led to solitary contemplation of ideas, complex storytelling, abstract learning, and using names as tags in text “containers.” Print enhanced these developments, and led to a more sophisticated use of visual space, organization of information (e.g., indexing, glossaries), and dominance of a few writing systems. Finally, electronic technologies are continuing some print trends (e.g., spatializing information, a high level of text processing), but are also promoting a turn to a “secondary orality” of participatory expression. Ong addresses several other issues important to cultural scholars, such as a shift from aural to visual sense primacy that writing helped promote, the rise of the unitary self, and the incompleteness of a sender/receiver model of communication.

Comments: This is a foundational work that informs the entire field of T&T, and it contains a large amount of material (e.g., effects on memory, theoretical differences between writing systems) that I’ve glossed over here. Ong states that most literary historians of his time approached oral cultures from an unconscious written perspective; he tries to point out the biases that can creep into scholarship from this perspective, which are important to keep in mind. I remain not entirely convinced that there actually was a dramatic aural/visual shift of the sort that Ong (and other theorists) proposes, but that’s only one of this book’s main themes.

Links to: McLuhan (communication media effects); Benjamin (changing technologies’ effects on perception of works); Headrick (communication media); Turkle (identity & writing technology); Murray (narrative and technology); Bolter (remediation during media shifts); some connections to visuals, info organization

Categories: exam readings, networking, networks

Exam reading: “Network”

Back to my exam readings in this post… “Network,” by Clay Spinuzzi, is an account of the operations of a telecommunications company: its development, its problems, how it operates successfully, and how work that seems simple from the outside is really quite complex. A large part of the book is dedicated to exploring two theories that describe networks (about which, more below).

While not long, it is definitely a dense book; there isn’t excessive repetition, but it did take me longer to finish than I had estimated. I’ll read some short texts next so I can feel better about crossing things off my list…

Summary: In this book, Spinuzzi uses two theories to describe the structure and function of a telecommunications company: Activity Theory (AT) and Actor-Network Theory (ANT). He chose a telecom company as an example of the highly decentralized type of knowledge work that is becoming more common in modern organizations. AT, based largely on Marxist dialectics, is a theory of learning and development through interaction; ANT, based largely on rhetoric, is a descriptive theory that focuses on how shifting relationships among actors in a network help define those actors. Spinuzzi spends a lot of time exploring similarities and differences between these two theories and giving examples of how they apply to situations at the company. He identifies four characteristics of highly networked organizations: members have heterogeneous skills/tasks, members are multiply linked, transformative shifts can change the goals of the network, and certain processes within the network are “black-boxed” (appear simple from the outside when they are, in fact, not). Texts help connect the different actors within the network in three ways: they are stable traces of (ephemeral) ideas, the structure of genre helps organize unfamiliar information into familiar patterns, and they act as boundary objects among actors operating within the network in different contexts. According to Spinuzzi, each of these theories can be used to describe different aspects of “net work,” although he concludes that AT (with its developmental focus) is most appropriate for similar future studies, provided that dialogue and rhetoric (strengths of ANT) are taken into account.

Comments: Although there are some interesting political implications of knowledge work here (e.g., worker segregation by education, the “homework economy,” the necessity of continual retraining), the most immediately useful aspects of this book for me will probably be the focus on learning in networks and how texts can function in shaping networks. There is a lot of material in this book, and it’s also useful as an introduction to AT and ANT that is grounded in specific examples.

Links to: Tomlinson (theoretical aspects of networks); Haraway (cyborg identity, workers’ need for constant learning); Brown & Duguid (information networks)

Categories: exam readings, information representation, visuals

Exam reading: “When Information Came of Age”

It’s been a while since I’ve posted, since I’ve been wrapping up an internship at the Cornell Lab of Ornithology. I’ll post about what I’ve done there later… For now, I’ve got a new book summary. “When Information Came of Age,” by Daniel Headrick, was a particularly relevant read at this point: at the internship (which I really will discuss later), I’ve been dealing with issues of representing information about bird families in different formats, and this book discusses the histories of the information systems that are a big part of that process. But more on that later…

Summary: Headrick proposes that the current “Information Age” is only one of many historical information revolutions; in this book, he focuses on the information revolution of the Enlightenment. He outlines five categories of information systems: classifying/organizing, transforming, display, storage/retrieval, and communicating. His thesis is that developments in these information systems during this period, coupled with subsequent technological inventions, laid the groundwork for the Information Age. During this period, demographic, cultural, political, and economic changes helped create a build-up of information that could only be made sense of by inventing new information systems. So, for example, scientific nomenclature and classification systems were developed that suggested explanations for phenomena (e.g., the chemical classification system suggested possible new compounds). Statistics were used to transform the demographic, political, and economic information that was beginning to be collected. Visual information displays (maps, graphs, and thematic maps) were used to present large datasets efficiently and in an easily recalled manner. Cross-referenced dictionaries and encyclopedias were successful at disseminating current, easy-to-access information to the general public (in contrast to earlier dense, thematically linked formats). Postal and telegraphic systems (visual and electric) were devised to transmit messages; these went from private messenger services to restricted government systems to more open government systems.

Comments: This book gives a good overview of the development of information systems, though some of the chapters are more comprehensive than others. His emphasis on information systems, rather than the earlier technologies that facilitated them (e.g., the printing press) or subsequent technological innovations, was an interesting choice (though apparently he’s addressed later periods in other books). This would probably be a useful book for a History of T&T course. While heavy on names and dates, it covers a really interesting period of history (see my Blituri project), and I will add that the Baroque Cycle is a mostly excellent series set right before this period. :)

Links to: Tufte (information representation); Benjamin (technologies of representation); Ong (social and technical aspects of changing representational practices)

Categories: exam readings, hypertext, research methods/philosophy

Exam reading: “Radiant textuality”

This is a summary of “Radiant Textuality,” by Jerome McGann. The book stems from the author’s work applying digital tools to the analysis of both literary and visual works. One of his major projects is the Rossetti Archive, which indexes the works of Dante Gabriel Rossetti, a poet and painter. Since my interests don’t lie in literary analysis of this sort, the things I pulled out of this book may not be McGann’s major points of emphasis.

Summary: In this book, McGann explores the application of digital tools to the critical analysis and interpretation of texts. He advocates a “quantum” model for textual analysis, which recognizes that textual interpretations are inherently variable. This method of analysis is performative, rather than traditionally interpretive/hermeneutic. He outlines two major methods for doing this, both of which can be aided by digital tools: 1) “deformations”: deliberately altering words or structure (or using filters, in the case of images) to bring out the underlying structural rules of the text; and 2) an essentially role-playing-game method (the “Ivanhoe Game”) for exploring multiple possible constructions of the text. Rather than treating texts as vehicles for transmitting meaning, he considers it more important to treat them as sets of algorithms that enable critical, introspective thinking about the text. Texts contain both graphical (design) and semantic signifying parts; it’s the latter that have been the subject of traditional interpretation, while he focuses on the former. These “invisible” design elements (organizational & linguistic) constitute a “textual rhetoric” or bibliographic code. Studying “deformations” is useful because it gets at what the text doesn’t do; this lets us see the underlying textual rhetoric.
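(A toy illustration, for my own notes: if I remember right, re-reading a poem backward is one of the deformations McGann discusses. Here’s a hypothetical few lines of Python applying that, plus an “isolating” move, to the opening stanza of Rossetti’s “The Blessed Damozel”; none of this comes from McGann’s own tools.)

    # Two trivial "deformations" of the opening stanza of Rossetti's
    # "The Blessed Damozel."
    stanza = [
        "The blessed damozel leaned out",
        "From the gold bar of Heaven;",
        "Her eyes were deeper than the depth",
        "Of waters stilled at even;",
    ]

    # Reordering deformation: read the stanza backward.
    for line in reversed(stanza):
        print(line)

    # Isolating deformation: keep only the line-final (rhyme) words.
    print([line.rstrip(";,.").split()[-1] for line in stanza])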

Comments: McGann’s “radiant textuality” of the title refers to writing (and other types of text) in which the medium is the message and promotes introspection, rather than writing whose purpose is information transmission. He states that his methods are most appropriate for creative/poetic texts, rather than expository writing, because these texts deliberately lean toward the creative/design end of the spectrum (rather than the expository/“scientific” end). My interests lie on that other end of the spectrum, so I don’t really see myself putting his methods into practice that often. As an aside, the arrangement of this book could itself be a set of variations on a theme: a series of essays exploring the same ideas in various ways.

Links to: O’Gorman (Blake; informational vs. design elements of texts), Hayles (how interaction between user & interface co-creates [Hayles] or explores [McGann] texts), Tufte (questions of form/design of texts), possibly Sullivan & Porter (quantum poetics rejects set methods of analysis or single interpretations)

Categories: exam readings, visuals

Strategizing and mapping, part 2…

For my candidacy exams, I’m focusing on two subject areas: public understanding of science (PUoS) and informatics of community science (or, how communities use cyberinfrastructure for science projects).

Both of these areas contain a wide variety of concepts, and there are many connections between the two. I’ve created a concept map to help me organize how these ideas fit together. You should be able to click on it for a legible version.

Concept map for ideas connected during my dissertation research

What I’ve tried to do here is connect the concepts from various readings to some of the key ideas. Three key topics on the left (in hexagons) come primarily from the PUoS literature, and represent important areas of focus in that literature: participatory genres of science learning, more traditional “communication” genres, and some key concepts that are critical to address in order to achieve public understanding of science.

Jumping to the right, the oval for PUoS is what I plan to do research on: I’m interested in how we can use online tools to improve PUoS. The general idea of this map is that we use the two diamonds (and connected ideas) in the center to get to this point. The topics in the diamonds come primarily from the informatics/community literature: the physical and social components of community science networks, and theories for understanding these networks. These topics either mediate or theorize specific approaches to PUoS, getting us to an endpoint where we can evaluate PUoS. Hopefully this all makes sense.

A good concept map shouldn’t need this much explanation, but this is a working draft, mainly intended as a tool to help me pull a wide variety of readings together into some sort of coherent form. What I’d ideally like to do is connect these concepts with the readings themselves, which will add another layer to this diagram. At least, that’s the plan (see the sketch below for how that layer might attach).
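(For what it’s worth, here’s a hypothetical sketch of what that might look like under the hood, if the map were kept as plain data rather than a drawing. The node names are my shorthand for the draft’s hexagons, diamonds, and oval, and the link labels are guesses.)

    # Skeleton of the draft concept map as plain data; shapes mirror the figure.
    concept_map = {
        "nodes": {
            "participatory genres":       "hexagon",
            "communication genres":       "hexagon",
            "key PUoS concepts":          "hexagon",
            "community science networks": "diamond",
            "network theories":           "diamond",
            "evaluating PUoS":            "oval",
        },
        "links": [
            ("participatory genres", "community science networks", "mediated by"),
            ("communication genres", "community science networks", "mediated by"),
            ("key PUoS concepts", "network theories", "theorized by"),
            ("community science networks", "evaluating PUoS", "leads to"),
            ("network theories", "evaluating PUoS", "leads to"),
        ],
        # The planned readings layer would attach sources to nodes, e.g.:
        "readings": {"network theories": ["Spinuzzi", "Tomlinson"]},
    }

    # Sanity check: every link endpoint should be a declared node.
    for src, dst, label in concept_map["links"]:
        assert src in concept_map["nodes"] and dst in concept_map["nodes"]
    print(len(concept_map["nodes"]), "nodes;", len(concept_map["links"]), "links")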

Categories: exam readings

Strategizing* and mapping, part 1…

One of my major projects at the moment is getting together reading lists for my dissertation candidacy exams. This means looking at two entire (sub)fields of research (as well as the core T&T field), trying to distill those fields down into a set of key publications, and then immersing myself in the ideas contained therein. It’s a big process, but I’m optimistic about the way it’s turning out.

I’m approaching this differently than the way I generated my reading list for my Master’s thesis, which was not really a focused approach. I remember spending a lot of time in the library (yes, back in the day when journal articles were actually kept in these paper things called “journals” that you physically went to photocopy) looking up the most recent papers on stream algae, reading those papers for relevant information, and backtracking their references to find older papers. I suppose this is a pretty typical way to approach research for someone who doesn’t have a well-thought-out research strategy. It’s not that I didn’t know how to use databases for keyword searches, but I was definitely doing some flailing while trying to grapple with the amount of information out there.

That experience with inefficiency taught me that a good research strategy is not something that’s necessarily going to drop out of the sky. There are a couple of key things I’ve tried to pull out of this experience that have been helpful for me at this point in my graduate career:

  • I have a better handle on the underlying subject areas than I did as a beginning Master’s student. This is important, even though it’s obvious: it leads to the next few things. The key point here is being widely read before beginning focused research.
  • Being able to evaluate the scale and scope of a field helps establish the boundaries and key concepts to focus on.
  • Being familiar with the major arguments and points of contention highlights the key concepts and threads to be aware of.
  • Since I plan to do an empirical study as part of my research, it’s been helpful to talk to people outside of my institution who are doing on-the-ground research. They offer different perspectives, and it’s useful to see what the current trends in the field are.
  • …On the subject of current trends, being aware of what funding agencies are focusing on is probably a good idea, career-wise. (To clarify: choosing a dissertation topic to match a specific funding program = bad; shaping research language to fit the topic into a broad area of interest = good? This is certainly a topic worth further exploration.)
  • Also, obviously talking to people on my committee has been important in shaping this process.
  • Evaluating the impact of specific papers. I’m not sure to what extent the article impact factor is a sciences-versus-humanities concept, but certainly the level of interest in the specific ideas brought forth in a publication is indicative of its centrality to a field (or at least to a specific debate in that field).
  • Keeping in mind that this is not all the reading I’ll be doing for my project. Leaving something out does not mean I won’t be using those ideas in the future.

This is pretty basic stuff and not really all that original, but I thought I should try to be introspective about it, with the idea that at some point in the future I might be giving other people advice about this process. It looks like the key things to do are read widely before beginning research, define the scope of the (sub)field, and get multiple perspectives. I’m sure I’m missing some important points here, though. And I’ll probably have different things to emphasize as I make my way through this process.

Well, this turned into a longer post than I thought it would! Next time, I’ll talk about my attempts to use concept maps to actually organize this information…

*”Strategizing”: not a real word, but fun to use…

Categories: exam readings, networks, tech design

Exam reading: “Greening through IT”

This book, “Greening through IT” by Bill Tomlinson, is one of the newer additions to the T&T core reading list, and it addresses an area that I think the core list as a whole previously ignored: the broad-scale material basis of electronic technologies. While many of the theorists covered in the program emphasize the connections between mind and materiality (e.g., the physical experience of interacting with a computer is part of what makes reading online different from reading a book), no one thus far has addressed the broader ecological implications of these new technologies.

I would venture that most theorists approaching the T&T field from a critical theory perspective are (understandably) not aware of the ecological sustainability issues surrounding electronic technology: for example, electricity use, e-waste, and the planned obsolescence of devices. Most authors focus on the social/philosophical implications of new technologies, and overall there’s a definite assumption that we will be able to continue to physically make and use these technologies in the future, without much consideration of natural resource limitations. Even the authors who focus on the “materiality” of technology concentrate on the individual user-machine interaction.

So there’s a need in the program for attention to these issues (which are a main concern of mine, given my background in ecology), and I think Tomlinson’s book does a decent job of addressing them. It’s not the perfect book on this issue for this program; I can see some of the more theory-centered students discounting it because of its low theory quotient (and that apparently annoying “evidence is used to support my theory, not contest it” thing). However, it does provide a needed perspective, and I’m not sure what an alternative text that addresses these issues might be…

Summary: The book discusses potential uses of information & communication technologies (ICT) for environmental sustainability. Tomlinson lays out a framework for “extended human-centered computing” (EHCC), which requires consciousness of scale (temporal, physical, complexity) when analyzing problems, guides system development in green directions, and compares technologies across a range of time/space/complexity scales to identify gaps not being addressed. There’s a lot of detail about various environmental problems, social barriers to change, and how we can use IT to address both of these large areas; it basically boils down to expanding our sense of, and ability to cope with, large time/space/complexity scales. He touches on three orders of effects of technologies, which need to be considered: 1st (direct effects), 2nd (specific impacts on other economic sectors), and 3rd (general social or cross-industry effects). He breaks down his discussion of pathways for applying green IT ideas into industrial, educational, personal motivation, and collective action categories (a lot of detail is outlined for each). He presents several case studies in education, personal data tracking, and collective action, and discusses how each worked (or didn’t).

Comments: This book was heavy on examples and perhaps short on theory. It would have been nice to see how the EHCC framework was used specifically to address gaps in scale (is there a heuristic for applying it in specific cases?). In the case studies, it seemed that the more controlled the environment, the better the technology worked for its designed purpose; e.g., the museum display worked better than the online programs that utilized crowdsourcing. I’d like to see more research on whether crowdsourcing/networking actually works for more than just getting Betty White on SNL (granted, this area of research is in its infancy). It’s also possible that more advanced research in museum displays/childhood education in general is responsible for this effect. The question seems to be how to get adults to buy into some of these ideas. Some of the cited examples (the Indian fishermen) seemed more effective, though that could be because he was emphasizing the positives. The context/discussion of the theory of punctuated equilibrium was odd: given his emphasis on the importance of metaphors that work on more than just the surface (an idea I feel strongly about), this bugged me.

Links to: Norman (technology design), Feenberg (applied case of design with technical and social ends in mind), Spinuzzi (objects & network formation?), Johnson (design for a purpose, though Tomlinson specifically focuses on the larger system rather than the user: the opposite direction from Johnson?)

Edited 8/28 to add links, correct Johnson’s name.

Categories: exam readings, pedagogy, research methods/philosophy, transparency, visuals

Exam reading: “E-crit”

This post is a summary of E-Crit: Digital Media, Critical Theory, and the Humanities by Marcel O’Gorman. I’ve read this book before and used some of the concepts in a paper; I thought I would read something that was a bit of a review after the last book I read… Coming right after Opening Spaces, it was interesting to see how this book focuses on postmodern methods without taking ethical considerations into account (though political considerations are part of it). The intersection of these two texts makes me think of a series of blog posts on iblamethepatriarchy.com that look at the intersection of feminist criticism and postmodern evaluations of art (pretty thought-provoking). Anyway, one of the comments on a post there said that exposure to feminist interpretation ruins all art, because you can no longer look at art without thinking about the material and social conditions under which that art was made. (I’m not entirely sure what the connection is here, but O’Gorman does a lot of postmodern art analysis as part of his argument.) So if you’re an art lover, maybe better not to follow that link…

Summary: O’Gorman tries to lay out a shift in academic methods that will revitalize humanities work by taking advantage of possibilities inherent in digital media. For him, academic disciplines are fragmented, hierarchical, and print-centered, which leads to interpretation (hermeneutics) and repetition rather than creativity (heuretics). He foregrounds three “remainders”/“others” of academic discourse: puns/nonlinear transitions, digital media, and imagery. He introduces “hypereconomy”: the use of “hypericons” to connect a network of discourses and lead to intuitive, exploratory linkages between them. One big emphasis is on picture theory: images are subjective (non-transparent) and in a struggle with text (think LOLcats: text and image can be contradictory and create new meanings). He contrasts the educational strategies of Ramus (classifying & compartmentalizing knowledge without reference to random mnemonic devices) with the work of William Blake (image/text contradictions, opposition to creating conformist students). He calls hypereconomy a “technoromantic” method of expression, using subjective, affect-based interpretations of print and images to create a bricolage of sorts. These constructs incorporate four primary images: personal, historical, disciplinary, and pop-cultural (he adds a written interpretive component when assigning them in his classes). Part of what they do is promote shifts in the figure/ground relationships in images (via subjective interpretations, “nonsense” connections, and hyperlinking). O’Gorman speculates that constant exposure to visual stimuli is leading to increased abstract & spatial reasoning. He concludes by laying out a plan to rejuvenate humanities departments by incorporating digital media studies and criticism: this would add technological “rigor” but still let departments teach criticism of the changing social/technological environment.

Comments: O’Gorman’s main focus seems to be the hypereconomy method as a tool for invention, and the call to incorporate digital media into humanities departments as a way to subvert “technobureaucratic” management of universities seems a bit tacked on. Some of the visual theory he builds his argument upon (e.g., Gombrich’s “mental set” of interpretations) isn’t empirically supported (as I recall). His concepts about non-transparent visuals & language have been the most useful things for me. I probably fall into the traditional linear-Enlightenment camp and am not convinced that hypereconomy projects can actually lead to useful critiques of institutions (I’m a bit too materialist, I guess). When I first read this book, I had a much stronger reaction to the anti-Enlightenment thread that runs through it; I’m either becoming inured to such a position or starting to reconcile the cognitive dissonances from my previous training…

Links to: Bolter (remediation, transparency)

Categories: exam readings, research methods/philosophy

Exam reading: “Opening spaces”

So, one of the things I want to use this blog for is a place to post summaries of, and thoughts on, my candidacy exam readings. We’ll see how consistent I end up being with this as I go along.

My first post is on Opening Spaces: Writing Technologies and Critical Research Practices by Patricia Sullivan and James Porter (1997). This book outlines the authors’ critical research philosophy. I had a difficult time getting through it; part of that reaction stems from my own research orientation/background, which is very different from that of the authors. I felt that they oversimplified the traditional research process, which they characterize as picking a preset set of methods, applying them, and then writing about the results (though I’m probably oversimplifying here). Granted, most research reports imply that this is how research works, glossing over changes in method, the vagaries of the specific research situation, etc. I have to keep in mind that I’m coming from a very different theoretical background here.

Summary: The authors advocate critical research practices, both in general and specifically in the context of technology and composition. Their philosophy is that methodology should be heuristic and context-dependent, that the knowledge generated should be situated, and that praxis (critical practice) is key to generating knowledge. Their emphasis is on practice, rather than on ideology or methods (the other two elements of research). Key concepts are that researchers should be alert for bias, conscious of situated and customized practice, willing to critically change methodology during research, aiming to liberate study participants, focused on user-technology interactions (rather than technology only), consciously involved in the research themselves, and explicit about the implications of their methods in the write-up. There is a distinction between fixed methods and malleable methodology. Their rhetorical/political goals include respecting difference, caring for others, promoting access to rhetorical procedures enabling justice, and liberating the oppressed through participant empowerment. They offer several ways of enacting their ideas, all of which center on exploring tensions in the research process: disciplinary tensions (use methodological framework mapping; list multiple binaries); environmental tensions (map the research scene: location, technologies, events, relationships, data collection capabilities; analyze participants’ vs. researchers’ metaphors); tensions between ideal methods and realizable positions (assumption analysis); and researcher-participant tensions (competing narratives; advocacy charting).

Comments: The authors strongly advocate for empowering participants, but object to “objective” measurements of study success. How, then, can we determine whether there has actually been a material improvement in participants’ situations? It seems like, for them, it’s enough to merely give participants tools for liberation. Among their research examples, they cite as imperfect a study in which the examples illustrated the theory presented but did not “challenge” or “complicate” it; this runs counter to my previous understanding of the roles of theoretical frameworks and empirical examples in research (different field, different expectations).

Links to: Feenberg (instrumental/substantive views of tech), Johnson (User-centered design), Bolter & Landow (comment that their research implies technological determinism/substantive view of media)

Edited 8/28 to correct Johnson’s name.