Friday, January 14, 2011

Week 2: The Beginning of Hypertext and the Web

What is "new" about "new media"?




What are the characteristics, both technical and social, of new media? 
How does new media transform and "remediate" earlier media practices?


As noted in the lecture notes, here are excerpts from Bolter and Grusin's Remediation:


Bolter, J. D. and Grusin, R. (2000). Remediation: Understanding New Media. The MIT Press, 1st edition.
 (excerpts selected and titled by course instructor)

Immediacy and Hypermediacy


Immediacy is our name for a family of beliefs and practices that express themselves differently at various times among various groups, and our quick survey cannot do justice to this variety. The common feature of all these forms is the belief in some necessary contact point between the medium and what it represents. For those who believe in the immediacy of photography, from Talbot to Bazin to Barthes, the contact point is the light that is reflected from the objects on to the film. This light establishes an immediate relationship between the photograph and the object. For theorists of linear-perspective painting and perhaps for some painters, the contact point is the mathematical relationship established between the supposed objects and their projection on the canvas. However, probably at no time or place has the logic of immediacy required that the viewer be completely fooled by the painting or photograph. Trompe l'oeil, which does completely fool the viewer for a moment, has always been an exceptional practice. The film theorist Tom Gunning (1995) has argued that what we are calling the logic of transparent immediacy worked in a subtle way for filmgoers of the earliest films. The audience members knew at one level that the film of a train was not really a train, and yet they marveled at the discrepancy between what they knew and what their eyes told them (114-133). On the other hand, the marveling could not have happened unless the logic of immediacy had had a hold on the viewers. There was a sense in which they believed in the reality of the image, and theorists since the Renaissance have underwritten that belief. This "naive" view of immediacy is the expression of a historical desire, and it is one necessary half of the double logic of remediation. (pp. 30-31)
As a counterbalance [to immediacy] hypermediacy is more complicated and various. In digital technology, as often in the earlier history of Western representation, hypermediacy expresses itself as multiplicity. If the logic of immediacy leads one either to erase or to render automatic the act of representation, the logic of hypermediacy acknowledges multiple acts of representation and makes them visible. Where immediacy suggests a unified visual space, contemporary hypermediacy offers a heterogeneous space, in which representation is conceived of not as a window on to the world, but rather as "windowed" itself—with windows that open on to other representations or other media. The logic of hypermediacy multiplies the signs of mediation and in this way tries to reproduce the rich sensorium of human experience. (pp. 33-34)
The logic of immediacy has perhaps been dominant in Western representation, at least from the Renaissance until the coming of modernism, while hypermediacy has often had to content itself with a secondary, if nonetheless important, status. Sometimes hypermediacy has adopted a playful or subversive attitude, both acknowledging and undercutting the desire for immediacy. At other times, the two logics have coexisted, even when the prevailing readings of art history have made it hard to appreciate their coexistence. At the end of the twentieth century, we are in a position to understand hypermediacy as immediacy's opposite number, an alter ego that has never been suppressed fully or for long periods of time. (p. 34)
In all its various forms, the logic of hypermediacy expresses the tension between regarding a visual space as mediated and as a "real" space that lies beyond mediation. Lanham (1993) calls this the tension between looking at and looking through, and he sees it as a feature of twentieth-century art in general and now digital representation in particular. (p. 41)

Media Con(Media)tent


Again, we call the representation of one medium in another remediation, and we will argue that remediation is a defining characteristic of the new digital media. (p. 45)
The digital medium can be more aggressive in its remediation. It can try to refashion the older medium or media entirely, while still marking the presence of the older media and therefore maintaining a sense of multiplicity or hypermediacy. [ . . . ] This form of aggressive remediation throws into relief both the source and the target media. (p. 46)
Finally, the new medium can remediate by trying to absorb the older medium entirely, so that the discontinuities between the two are minimized. The very act of remediation, however, ensures that the older medium cannot be entirely effaced; the new medium remains dependent on the older one in acknowledged or unacknowledged ways. (p. 47)
[ . . . ] remediation operates in both directions: users of older media such as film and television can seek to appropriate and refashion digital graphics, just as digital graphics artists can refashion film and television. (p. 48)

What is New About New Media?


Our primary concern will be with visual technologies, such as computer graphics and the World Wide Web. We will argue that these new media are doing exactly what their predecessors have done: presenting themselves as refashioned and improved versions of other media. Digital visual media can best be understood through the ways in which they honor, rival, and revise linear-perspective painting, photography, film, television, and print. No medium today, and certainly no single media event, seems to do its cultural work in isolation from other media, any more than it works in isolation from other social and economic forces. What is new about new media comes from the particular ways in which they refashion older media and the ways in which older media refashion themselves to answer the challenges of new media. (pp. 14-15)

The Reality of Remediation


The process of remediation makes us aware that all media are at one level a "play of signs," which is a lesson that we take from poststructuralist literary theory. At the same time, this process insists on the real, effective presence of media in our culture. Media have the same claim to reality as more tangible cultural artifacts; photographs, films, and computer applications are as real as airplanes and buildings.
        Furthermore, media technologies constitute networks or hybrids that can be expressed in physical, social, aesthetic, and economic terms. Introducing a new media technology does not mean simply inventing new hardware and software, but rather fashioning (or refashioning) such a network. (p. 19)






SEED QUESTIONS - Please Post Comments Here


Q1. After reading Vannevar Bush’s “As We May Think,” consider Bush’s reputation as the “father” of hypertext (although he did not coin the term). To what extent can we see his concept implemented in the World Wide Web, which for many people defines their notion of hypertext? What are the differences?

Q2. Andries van Dam encourages us to approach hypertext as a new medium and not copy “old, bad habits.” What are some new ways to think about hypertext? How might we use hypertext in publishing, in writing, in thinking?


Q3. In 1993, Joe Levy said: “if information is available, then any (authorised) person should be able to access it from anywhere in the world.” What implications does this thinking have for our own notions of publishing and the current online environment? You can use examples from your own experience.

5 comments:

  1. The question of the effect of open access to information on the publishing environment is of great interest to me. I write primarily fiction (short stories) and am always looking for sympathetic homes for my pieces where interested readers can (hopefully) enjoy them. The traditional publishing model for short stories is highly discouraging to authors. Duotrope (www.duotrope.com) is a great resource for writers, as it provides details about markets, both online and print. One of the interesting statistics provided is the acceptance rate for each publication (i.e., the number of pieces published compared to the number submitted). For traditional print magazines, this rate is usually between 1% and 2%. In other words, 98% to 99% of submitted pieces are rejected, usually with no feedback whatsoever. I never realized that rejection rates were this high until I accessed Duotrope, as this information was not available through conventional sources.

    Rejection rates for online magazines are still high, but not as high as those for traditional magazines. This makes sense, as online magazines are not bound by the restrictions of page counts or finances. (There is still a cost to running an online magazine, but it is minimal compared to hard-copy printing and distribution.) This trend is encouraging to writers, as new markets are constantly opening and gaining credibility.

    Despite the positive effect of more widely available literature, some literary types have questioned the quality of online magazines. This objection gets at the key issue of who the content experts are and how they control the distribution of knowledge. There is a belief that because print magazines have limited resources that they must ration carefully, they are more serious and stringent in their assessment of quality, and that their editors are the sole arbiters of literary merit. Some believe that the quality of online magazines can’t be as good because these magazines don’t have the same investment at stake.

    I believe this argument is not supportable. Although the internet has certainly provided an easily accessible distribution channel for bad writing, my experience has been that editors of online magazines are just as serious about the literary quality of the product as traditional editors. As well, I have found that editors of online magazines are more willing to work directly with the author, rather than just sending out a form rejection letter. I think this may be a result of the fact that traditional magazines have an infrastructure in place that protects and isolates the editors from the writers, whereas the connection in the online environment is more direct and instantaneous.

    I recently published a piece in Numero Cinq. This is really just Doug Glover’s blog, but the quality of the presentation is high, as is the quality of the writing. Visually, the blog has the look of a magazine, with the added feature that it is easy to navigate to whatever area interests you. Doug has done a good job of cataloguing and organizing the contents. He clearly has the literary credentials (2003 Governor General’s Award for fiction) to edit this magazine, but instead of simply accepting submissions and then rejecting most of them, he commissions work directly. He found me through Facebook and asked for a piece of writing. He then worked with me through several revisions until the piece was ready to publish. His involvement and mentorship far exceed anything I have ever experienced with a print magazine, and I believe he is able to do this because he does not have to deal with all the demands of traditional publishing infrastructure. As a result, more good writing that might never have been published is now available.

  2. Michael Joyce’s idea of clustering leading to new rhetorical forms is probably the most interesting implication of his “constructive hypertexts.” His assertion that interaction with the text allows us to become the author, and thus leads us to discover our own thought structures, is perhaps the greatest aspiration of the digital age. Malloy, quoting Hayles, expresses the clustering idea somewhat differently, stating that “hypertext fiction offers narratives that operate as networks rather than linear sequences.” She goes on to describe the hypertext experience as “foraging for episodes” and asserts that “humans see the world through a process that is not as sequential as is often portrayed in works of fiction.” This assertion led me to consider the second seed question, which suggests that hypertext shouldn’t copy “old, bad habits.” Are traditional narratives really full of old, bad habits?

    I think it is true that traditional storytelling follows a fairly linear model. This model has, of course, been played with and twisted innumerable times by many fine authors. Sometimes these experiments work, sometimes they don’t. I suffered through James Joyce’s Ulysses; although I treasured his brilliance and attention to detail, it was mostly an appallingly dull and confusing read. Stream-of-consciousness novels try to capture the non-sequential nature of the human mind, but the communication is still a uni-directional presentation of text. Most writers understand that although the human mind is very non-sequential, trying to replicate this process does not necessarily lead to a good story. Meaning requires structure, and in the traditional storytelling model the structure is supplied by the author: through the selection of which details to present, which details to omit, the order of presentation, the figurative language used to present those details, etc.

    Malloy suggests that she is trying to put the reader in the narrator’s mind, and thus lead the reader to some understanding of the “associative and contingent flow of human thought.” Her experiments do this, but the reader still wants to acquire meaning from the experience. Her work could be described as poetry, although distinctions such as “poetry” and “prose” are less clear in the hypertext world, and are probably unimportant anyway. Traditionally, poetry and prose forms both try to convey meaning to the reader, although poetry tends to use the more associative and non-linear techniques.

    As I navigated my way through The Roar of Destiny, I was engaged by the process. There were complete sentences, but clicking on different fragments would lead to new sentences which assembled the existing fragments in different ways. This process really does force readers to construct their own meanings, as there is clearly no authorial intent to convey a single message. Sometimes the sentence fragments didn’t convey much meaning on their own, but certain associations or images would suggest themselves when the fragments were combined. The different coloured blocks on the screen were also interesting, as people do have certain emotional responses to different colours.

    I enjoyed the experience of winding my way through The Roar of Destiny, but this experience is not the same as reading a good story. I can enjoy reading prose, poetry and hypertext, but I get a different experience from each. Hypertext fiction shouldn’t try to copy traditional storytelling methods, but it seems like the natural evolution of the concepts behind earlier stream-of-consciousness novels. I don’t think hypertext fiction will replace traditional narrative, but the possibilities suggested by the form should be explored.
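
    To make Hayles’s “networks rather than linear sequences” idea concrete, here is a minimal sketch (in Python) of a narrative modeled as a graph of lexias. The node names, link labels and fragment texts are all invented for illustration; this is not Malloy’s actual structure, just one way the mechanics could work:

    # A toy model of "narrative as network": each lexia holds a text
    # fragment and named links to other lexias. A reading is one path
    # through the graph, chosen by the reader rather than the author.
    # (All names and texts below are invented for illustration.)
    lexias = {
        "harbour": {"text": "The ferry horn sounds across the grey water.",
                    "links": {"memory": "kitchen", "onward": "market"}},
        "kitchen": {"text": "Years earlier, steam fogs the window above the sink.",
                    "links": {"return": "harbour"}},
        "market": {"text": "Stalls of fish and wet rope; a stranger waves.",
                   "links": {"follow": "harbour", "memory": "kitchen"}},
    }

    def read(choices, start="harbour"):
        # Follow a reader-chosen sequence of link names and collect
        # the episodes "foraged" along the way.
        node = start
        episodes = [lexias[node]["text"]]
        for choice in choices:
            node = lexias[node]["links"][choice]
            episodes.append(lexias[node]["text"])
        return episodes

    # Two readers, two different "stories" from the same network:
    print("\n".join(read(["memory", "return", "onward"])))
    print("\n".join(read(["onward", "memory"])))

    Two readers making different link choices assemble the same fragments into different sequences, which is roughly the “foraging for episodes” experience described above.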

  3. The current online environment is littered with so much information that the biggest challenge is cutting through it all to get what you want and need. Publishing to the internet has become such a simple process. Add the ability to publish from mobile phones and other devices, and you have an internet filled with data and information. Hyperlinks have worked to spread the information (good and bad) further and faster, leaving an enormous number of pages, links and applications.

    This has really forced those who access the internet to be more active with the content presented to them. Instead of just consuming everything thrown at them, they have to establish filters and methods to cut through the garbage. There is no manual or guidebook out there to tell people how to do this. Instead, it is up to them, using their skills, their experience and their personal traits (biases, preferences), to do the filtering. Silverman (2011) argues that this kind of critical filtering is a new literacy that should be taught in schools, since “the Internet is the single greatest disseminator of bullshit ever created [and] is also the single greatest destroyer of bullshit.” Such teaching would help users learn to decide for themselves what is good and bad, and to develop their own methods.



    Silverman, C. (2011, January 20). Bullshit Detection 101: Why universities need to teach the new literacy. Retrieved January 21, 2011, from http://www.regrettheerror.com/2011/01/20/bullshit-detection-101-why-universities-need-to-teach-the-new-literacy

  4. I agree with Sunil's comments that today's online world brings an added responsibility for the reader: to validate the information they receive and to conduct their searches for online information responsibly, using the right keywords, referring to trusted websites, etc.

    I wonder: with the added responsibility on the reader, are we taking away responsibility from the publisher/writer? With the ability and encouragement in today's online world to publish immediately, without checks on sources, background information and so on, are we allowing quality to be compromised?

    Technology allows everyone to become a publisher through blogs, smartphones and online forums, with no editor needed. Does this also result in a lack of quality? Broken links, spelling mistakes, pictures that are either very low quality or in bad taste, outdated information and even inaccurate information flood the online world. Perhaps before, we were able to hold the publisher of a newspaper or the publishing house of a book responsible for these mistakes, but who is doing this now, when the information is endless and its sources are endless too?

  5. I agree with Joe Levy: “if information is available, then any (authorised) person should be able to access it from anywhere in the world.” This comment is especially true in today's digital environment. Once someone posts something on their blog, Facebook profile or Twitter account, it is open for the general public to see, so anyone across the world has access to this information.

    I agree with Sunil and TKB's comments about the implications this thinking has for our own notions of publishing and the current online environment. Everyone can present themselves as an "expert" now, so be cautious about what you read and where the information is coming from. I had a personal incident on Facebook where I thought a family member had had a severe heart attack. It started with my cousin posting "my mom has been rushed to the hospital". The information flowed through different family members, so by the time I saw the updates, someone had posted "She had a major heart attack". The correct information was lost somewhere in the series of "updates". This is one of many examples that I could use.

    Quality will be compromised, but it is also our responsibility to check the resources and research more than one source.
