
Reading Machines: Toward an Algorithmic Criticism

Copyright Date: 2011
Pages: 112
    Book Description:

    Besides familiar and now-commonplace tasks that computers do all the time, what else are they capable of? Stephen Ramsay's intriguing study of computational text analysis examines how computers can be used as "reading machines" to open up entirely new possibilities for literary critics. Computer-based text analysis has been employed for the past several decades as a way of searching, collating, and indexing texts. Despite this, the digital revolution has not penetrated the core activity of literary studies: interpretive analysis of written texts.

    Computers can handle vast amounts of data, allowing for the comparison of texts in ways that were previously too overwhelming for individuals, but they may also assist in enhancing the entirely necessary role of subjectivity in critical interpretation. Reading Machines discusses the importance of this new form of text analysis conducted with the assistance of computers. Ramsay suggests that the rigidity of computation can be enlisted by intuition, subjectivity, and play.

    eISBN: 978-0-252-09344-9
    Subjects: Technology, Language & Literature

Table of Contents

  1. Front Matter
    (pp. i-vi)
  2. Table of Contents
    (pp. vii-viii)
    (pp. ix-xii)
    (pp. 1-17)

    Digital humanities, like most fields of scholarly inquiry, constituted itself through a long accretion of revolutionary insight, territorial rivalry, paradigmatic rupture, and social convergence. But the field is unusual in that it has often pointed both to a founder and to a moment of creation. The founder is Roberto Busa, an Italian Jesuit priest who in the late 1940s undertook the production of an automatically generated concordance to the works of Thomas Aquinas using a computer. The founding moment was the creation of a radically transformed, reordered, disassembled, and reassembled version of one of the world’s most influential philosophies:


    (pp. 18-31)

    The word “algorithm” is an odd neologism. Most scholars now believe that the word relates back to the word “algorism,” which is in turn a corruption of the name of the Persian mathematician al-Khwārizmī, from whose book, Kitāb al-jabr wa’l-muqābala (“Rules for Restoring and Equating”), we get the word “algebra” (Knuth 1). Throughout its varied history, the term has more or less always borne the connotation of a method for solving some problem, and, as the early slide from sibilant to aspirate would imply, that problem was most often considered mathematical in nature. During the twentieth century, however, the word...

    (pp. 32-57)

    “Algorithmic criticism”—the term I use to designate a reconceived computer-assisted literary criticism—shares with Oulipo a desire to use the narrowing forces of constraint to enable the liberating visions of potentiality. Its medium is the computer, but it looks neither to the bare calculating facilities of the mechanism nor to the promise of machine intelligence for its inspiration. Instead, algorithmic criticism attempts to employ the rigid, inexorable, uncompromising logic of algorithmic transformation as the constraint under which critical vision may flourish. The hermeneutic proposed by algorithmic criticism does not oppose the practice of conventional critical reading, but instead attempts...

    (pp. 58-68)

    Even scholars working far outside the disciplines that make up the field of artificial intelligence are familiar with the basic elements of the Turing test, in which the machine’s ability to mimic human language is presented as the touchstone of intelligent behavior. It is usually presented in the following way: A human being and an entity that is either a human being or a machine are separated from each other by a wall. The first human is allowed to pose questions to the unseen entity by means of a teleprinter. If that human being is unable to determine from the...

    (pp. 69-82)

    A few years ago, Martin Mueller, the animating force behind the text analysis system WordHoard, decided to perform what we might call an experiment but would better be thought of as the fulfillment of a brief moment of curiosity. Using the system’s powerful word-counting and lemmatization features, Mueller was able to create lists of the most frequent words in Homer and Shakespeare:

    I call attention to the spur-of-the-moment character of his investigations, not to suggest that one could not conduct elaborate experiments involving word frequency (many have done so), but simply to point out the ways in which this operation...
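    The word-frequency operation described above is simple enough to sketch in a few lines of Python. This is not WordHoard's implementation (WordHoard is a Java application with full lemmatization; the sketch below counts surface forms only), merely an illustration of the underlying operation, applied to a hypothetical snippet of text:

    ```python
    import re
    from collections import Counter

    def most_frequent_words(text, n=10):
        """Return the n most frequent word forms in a text.

        No lemmatization is performed: "speak" and "speaks"
        count as distinct words, unlike in WordHoard.
        """
        words = re.findall(r"[a-z']+", text.lower())
        return Counter(words).most_common(n)

    sample = "To be, or not to be: that is the question."
    print(most_frequent_words(sample, 3))
    # "to" and "be" each occur twice; all other words once
    ```

    A real experiment of the kind Mueller performed would add a lemmatizer in front of the counter, so that inflected forms of the same headword are tallied together.
    
    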

    (pp. 83-86)
  10. NOTES
    (pp. 87-90)
    (pp. 91-94)
  12. INDEX
    (pp. 95-98)
  13. Back Matter
    (pp. 99-100)