Video Games and Paratextuality

About 40 years ago, the French narratologist Gérard Genette introduced the term seuils—French for thresholds, translated into English as paratexts—to describe the many ancillary texts that may accompany a book’s publication, from cover images to authorial prefaces to reviews. Authors, publishers, printers, patrons, readers, and others involved in the making and reception of books have long been familiar with paratexts, especially in the early modern period, when printed books often appeared with complex configurations of prefaces, epistles to readers, appeals to patrons, errata sheets, and printed commentaries. By giving a single name to this category of texts, and by broadening it to include independently circulating materials such as author interviews and critical reviews, Genette (and his English translator, Jane E. Lewin) gave us a word to conjure with.

Today, the term paratext has travelled beyond literary studies and book history into fields that study other kinds of media, including film and television studies (see Jonathan Gray’s Show Sold Separately: Promos, Spoilers, and Other Media Paratexts; NYU Press, 2010), and videogame studies via Mia Consalvo’s Cheating: Gaining Advantage in Video Games (MIT Press, 2007) and Steven E. Jones’s The Meaning of Video Games: Gaming and Textual Strategies (Routledge, 2008). The concept of paratext has proven especially fruitful for understanding videogames, as Jan Švelch has recently demonstrated in a survey of the term’s uptake by the field.

My own contribution to this conversation among fields has taken the form of a recent special issue for the journal Games and Culture, on the topic “Video Games and Paratextuality”:

  1. Alan Galey, “Introduction: Reconsidering Paratext as a Received Concept”
  2. Regina Seiwald, “Beyond the Game Itself: The Paratextuality of Video Games”
  3. Jon Saklofske, “To the Center of Nowhere: Deep Mapping Digital Games’ Paratextual Geographies”
  4. Alan Galey, “Behind the Scenes at ApertureScience.com: Portal and Its Paratexts”
  5. Steven E. Jones, “Response: In and Out of the Game, as Usual”

(Most of the articles are available open access via the journal directly, but in a couple of cases I’ve linked to versions on the authors’ institutional repositories. Unfortunately the journal made an error and published some of these articles in different issues; see the individual articles linked from this page for the correct citation details.)

This special issue grew out of a panel I organized for the 2021 conference of the Canadian Game Studies Association. Our collaboration, however, began a year earlier in 2020, partly as a way to maintain connections with colleagues during the isolation of the COVID-19 pandemic, especially in its early months. As with many pandemic projects, it took some time to finish the articles, and there were disruptions along the way, but I am very happy about the connections we made. Each of the contributions shows how concepts of textual scholarship translate to the study of videogames, but also how videogames require us to modify, rethink, and sometimes abandon ideas that may be too book-centric in their original framing. In my introduction I wrestled with Genette’s Eurocentric and presentist biases in particular, but we all drew inspiration from Steve’s 2008 book The Meaning of Video Games, which helpfully conceives of videogames as texts, in the broadest sense of that term, and not merely as analogues or competitors for books. (Though there’s definitely a need for work on relationships between books and videogames, including books in videogames; see this recent article on “High Fantasy RPGs and the Materiality of the Medieval Book” by Bard Swallow, one of my students in the Book History and Print Culture program, also in Games and Culture.)

My own article focuses on an especially weird and fun paratext for the classic videogame Portal, a first-person platform/puzzle-solving game set in the fictional Aperture Science Laboratories. In this game, the player must navigate their way through an underground research facility that is strangely empty of other humans, accompanied only by the voice of GLaDOS, a passive-aggressive rogue AI fixated on carrying out Aperture Science’s research program even after some hinted-at global catastrophe—which ties into the Half-Life games from the same publisher, Valve. As I unpack in the article, Aperture Science is a finely tuned satire of postwar big-science research, with GLaDOS as its insane AI genius loci (written by Erik Wolpaw and Chet Faliszek, and voiced by Ellen McLain in one of the greatest videogame voice-acting performances of all time, which continues in Portal 2).

The paratext I take as a case study is the tie-in website ApertureScience.com, launched in 2006 to promote the game:


As you can see in this screenshot, this isn’t your typical promotional website. ApertureScience.com used Flash to simulate a blinking cursor and command prompt in the style of the DOS operating system (hence GLaDOS’s name), and visitors could enter commands once they logged into the system. For many videogame players of my generation, a blinking green (or maybe amber) command line was how we first learned to interact with computer systems, and it remains a visual trope in depictions of computing today. ApertureScience.com simulated the experience of using one of the lab’s own computers, which ran the GLaDOS operating system (apparently v1.07 from 1982), and allowed users to explore files, discover secrets, and even fill out a very strange application questionnaire if they could work out the right commands.

You may have noticed my use of past tense in the paragraph above. ApertureScience.com’s command-line simulation was implemented in Flash, which means the site ceased to work in most web browsers after 2021. Even before then, however, ApertureScience.com had been updated a few times, and the command prompt was replaced with a Christmas-themed video in 2010. For many years, the only way to access the earlier, interactive versions of ApertureScience.com was via the Internet Archive’s Wayback Machine. (There was a period after 2021 when the site wouldn’t work on the Wayback Machine either, but it works now.) The site’s history has been thoroughly documented on the Half-Life wiki on Fandom.com, and I’ll return to the question of ApertureScience.com as an artifact of internet history below.

But before I say any more about how we access this videogame paratext, there’s the bigger question of why? As I explain in my article, ApertureScience.com is an in-universe paratext for Portal, meaning that it adds to Portal’s story and fleshes out its world. Works of fanfiction do this all the time, but ApertureScience.com is not fanfiction; it was created by Valve and the game’s writers, and is considered part of Portal’s canon. One can play Portal the videogame without experiencing the website—in any of its multiple versions—but canonical paratexts like this raise intriguing questions about where Portal’s edges as a creative work begin and end. For example, in the game itself, one may do some optional exploring and discover these login credentials written on a wall in a supposedly inaccessible part of the research facility:


The login name “cjohnson” and password “tier3” next to the words “trust me” can be used at ApertureScience.com to log in as Aperture Science’s eccentric founder, Cave Johnson (who appears as a character in the 2011 sequel, Portal 2, voiced to perfection by the actor J.K. Simmons). Players who never discover this easter egg can still log in using any username and the easily guessable password “portal” or “portals,” and can open the “apply.exe” file to fill out the bizarre application form—itself a subtle world-building text which tells the reader as much about the organization asking the questions as it does about anything else. But logging in to ApertureScience.com as Cave Johnson gives us access to a second, secret file:


Opening the “notes.exe” file from the list above grants access to a brief historical timeline of Aperture Science Laboratories, leading up to the events that initiate Portal. The whole thing is a fun read; here’s just the first screen:


Like most easter eggs, all of this material is deliberately hard to access, but rewarding for diligent players once they find it. What I like most about this easter egg, however, is not only the background and nuance it adds to an intentionally minimalist story, but also the fact that ApertureScience.com is a distinct digital artifact in its own right while also part of Portal as a creative work that spans multiple artifacts—not to mention multiple versions, which is one of the preoccupations of textual scholarship no matter the form of the text in question, whether manuscript, print, born-digital, or otherwise. What do paratexts like ApertureScience.com mean for the study of videogames as cultural texts and historical artifacts, and for their preservation, archiving, and interpretation in the future? These are questions I consider in my article, and which my co-contributors to the special issue, Regina, Jon, and Steve, unpack alongside questions of their own.

As I suggest in the article—building on a line of inquiry I’ve pursued in an earlier article on The Tragically Hip and pro-am music archiving—we can learn much about these questions from the work of dedicated fan communities and the resources they create, like the entry for ApertureScience.com on the Half-Life Fandom.com wiki. As I was working on this article, a similar fan-driven project at the Valve Archive was creating an emulated version of ApertureScience.com (in all four of its distinct versions) so that players could have access to this otherwise ephemeral part of videogame history. If you’re intrigued by the description of ApertureScience.com above and in my article, I recommend checking out their version—armed with the secret commands documented on the wiki.

Now that the Valve Archive has created what amounts to a facsimile of the original site, my next project may take the form of a scholarly critical edition of ApertureScience.com—i.e. an accessible version which includes commentary to help the reader, a record of the variants between the versions, and an analysis of the original Flash code for the site. My PhD student Ellen Forget and I have begun work on this, using the Twine platform, and I’ll post an update down the road.

History and Design: The Past and Potential of Video Games in Interdisciplinary Scholarship

On Monday, September 18, my colleague Velian Pandeliev and I will be leading the first event in the Faculty of Information’s Crosstalk series: a set of informal public lectures organized by Matt Ratto, offered in connection with our PhD program, and designed to bring iSchool faculty from different disciplines together to discuss a shared topic. Our topic is video games, which I approach from an historical perspective and Velian approaches from a design perspective, though we share many of the same interests. Like other cultural artifacts, video games cannot be understood from any single disciplinary perspective. A designer may see an opportunity to create a tool that influences the world around them, an historian may see an artifact that embodies the forces that shaped it, and players inevitably make their own meanings. In this crosstalk, we will consider how historical and design approaches both complement and challenge each other, and what we can learn about multi-/interdisciplinary scholarship as we shift our own perspectives.

Details for this and other events in the Crosstalk series can be found in the PDF posted above. It’s open to the public, but the organizers are asking that attendees register in advance: https://forms.office.com/r/NXKuDEGH9k.

Ahead of the talk, the PhD students will be reading a couple of our recent publications, which I’ll share here for anyone else who might be interested:

During the talk, we’ll be mentioning some links which I’ve gathered together here for convenience:

2022 Warren Lecture: A Bibliographical Disturbance: Teaching and Learning in Book History After 2020

On April 26 I was delighted to give the 19th Frederic Alden Warren Lecture for the John W. Graham Library at Trinity College, University of Toronto, titled “A Bibliographical Disturbance: Teaching and Learning in Book History After 2020.” For a list of references and links to some of the materials I discuss in the talk, scroll down below the poster.

Abstract:

When something disrupts the normal process of making a book, the disruption often leaves a material trace which textual scholars call a “bibliographical disturbance.” The year 2020 will long be remembered as a similar kind of disruption writ large, leaving its own material traces in our scholarship, careers, and lives. For the Book History & Print Culture (BHPC) program at the University of Toronto, 2020 also happened to be its twentieth anniversary as a graduate program. What should have been a year of celebration instead became a year of adaptation, as the COVID-19 pandemic forced us to rethink BHPC’s normally library-based, book-focused courses for remote delivery. BHPC’s twentieth anniversary became an occasion to re-examine the field’s rationale and pedagogy—just as bibliographical disturbances are opportunities to understand a book’s structure and nature.

In that spirit, this talk will reflect on lessons learned about book history education during the pandemic. From the representation of physical books on digital screens, to the status of born-digital literature, to the social value of the book arts, to questions about diversity and equity in the field of book history—2020 brought a reckoning with all these topics and more. Yet book history education has never been more necessary than today, and textual scholarship has important work to do in the post-2020 world. This talk will look back on what we’ve learned from twenty years of book history at the University of Toronto, and will look ahead to the next twenty.

Here are some footnotes to certain points in the talk where I referenced a specific book or other source. I’ve organized them here in the sequence they appear in the three sections of the talk.

Part 1: Bibliographical Disturbances

  1. The lightning bolt/tree image from the poster and video is a wood engraving made by Canadian artist and filmmaker Laurence Hyde for a planned edition of Shakespeare’s Macbeth from Golden Dog Press, a private press founded by J. Kemp Waldie, many of whose books and records are held by the Graham Library at Trinity College (including the 1497 Dante book I showed in the lecture). The Golden Dog Press edition of Macbeth never came to pass, but a copy of Hyde’s Engravings for Macbeth (ca. 1939) is held by the Fisher Rare Book Library.
  2. The concept of bibliographical disturbances is one that David Greetham discusses at several points in his work; in particular, see his chapter “Slips and Errors in Textual Criticism” in Textual Transgressions: Essays Toward the Construction of a Bibliography (New York: Routledge, 2011), 349–356.
  3. The example of a bibliographical disturbance in which the hair- and flesh-sides of parchment leaves face each other in an opening is from a copy of William de Wycumbe, Vita venerabilis Roberti Herefordensis episcopi (Llantony Secunda: ca. 1200), held by the Fisher Library. I discuss it in more detail in this entry on openings in books for the project Architectures of the Book.
  4. The Dante book is an early edition of the Divine Comedy held by the Graham Library at Trinity College (and formerly owned by J. Kemp Waldie of Golden Dog Press), Danthe alighieri fiorentino (Venice, 1497). On the reading and censorship of Dante generally, and on the paratextual commentary by Christophoro Landino, which accompanied Dante’s poem on the page in this and many other editions, see Simon A. Gilson, Reading Dante in Renaissance Italy: Florence, Venice, and the ‘Divine’ Poet (Cambridge University Press, 2018). I compared the censored Dante book with images from this story from CNN from April 22 about Florida’s censorship of math textbooks.
  5. For a digital facsimile of the copy of the 1623 Shakespeare First Folio discussed in the lecture, see the Internet Shakespeare Editions (the link should take you straight to Troilus and Cressida). For more information on the First Folio, including details on reading collation statements and understanding the Troilus and Cressida anomalies, see the Folger Shakespeare Library’s website. On the use of digital First Folio facsimiles, see Sarah Werner’s website (this page links to her excellent chapter in the Cambridge Companion to Shakespeare’s First Folio). I have also written about the First Folio’s bibliographical features in chapter 3 of my book The Shakespearean Archive: Experiments in New Media from the Renaissance to Postmodernity (Cambridge University Press, 2014), and in an article co-authored with Rebecca Niles, “Moving Parts: Digital Modeling and the Infrastructures of Shakespeare Editing,” Shakespeare Quarterly 68, no. 1 (2017): 21–55.
  6. The tree rings image comes from a blog post by Sarah Nason and Sonya Odsen, “A Wildfire Story: Decoding the Past with Tree Scars,” for the Landscapes in Motion project (April 9, 2019). Note their use of the word disturbances to describe environmental events that leave material traces in trees as living organisms.

Part 2: Teaching and Learning in the BHPC Program, 2020–2022

  1. For more about the Book History & Print Culture collaborative program at the University of Toronto, see bhpctoronto.com.
  2. On book-making kits as an emergent genre during the pandemic, see Leah Price’s talk “Book Leaning, Hands-Off?” for Princeton University’s 2020 colloquium on “The Virtual Materiality of Texts: Book History During a Pandemic” (and see the other talks as well). See also Shannon Mattern’s essay “Unboxing the Toolkit” on Tool-Shed.org (July 9, 2021).
  3. For more about Toronto’s Paperhouse Studio, see paperhousestudio.com.
  4. The William Caxton book shown in one of the slides (being examined by a student with a magnifying glass) contains an English translation of two essays by Cicero, De Amicitia (“On Friendship”) and De Senectute (“On Old Age”), and one by Giovane Buonaccorso da Montemagno, De Nobilitate (“On Nobility”), and was printed by Caxton in Westminster in 1481, making it the first printed English translation of classical humanist texts and presently the oldest printed book in Canada. For details, see the library catalogue entry linked above and this media release on the book’s acquisition by the Fisher Library.
  5. The box of artists’ books shown in one of the slides is Millennium in a Box: a Portfolio Collection of Work by Book Artists from Across Canada, from the Canadian Bookbinders and Book Artists Guild.
  6. There has been much bibliographical writing that deals with the field’s need to engage with the world outside the reading room windows, so to speak. For some recent examples, see Matthew G. Kirschenbaum, Bitstreams: the Future of Digital Literary Heritage (Philadelphia: University of Pennsylvania Press, 2021), particularly the conclusion; Matt Cohen, “Time and the Bibliographer: a Meditation on the Spirit of Book Studies,” Textual Cultures 13, no. 1 (2020): 179–206; Kate Ozment, “Rationale for Feminist Bibliography,” Textual Cultures 13, no. 1 (2020): 149–178; Michelle R. Warren, Digital Holy Grail: a Medieval Book on the Internet (Stanford University Press, 2022); Whitney Trettien, Cut/Copy/Paste: Fragments from the History of Bookwork (Minneapolis: University of Minnesota Press, 2021); Brigitte Fielder and Jonathan Senchyne, eds., Against a Sharp White Background: Infrastructures of African-American Print (Madison: University of Wisconsin Press, 2019).

Part 3: Archives and the Future

  1. On the need to understand archives beyond metaphors, see Michelle Caswell, “‘The Archive’ is Not an Archives: Acknowledging the Intellectual Contributions of Archival Studies,” Reconstruction 16, no. 1 (2016). I have also written about the need for textual and literary scholars to engage with archival studies (i.e. the scholarly literature of actual archivists) in chapter 2 of my book The Shakespearean Archive (Cambridge University Press, 2014) and more recently in “The Work and the Listener,” Textual Cultures 14, no. 1 (2021): 50–64.
  2. The Fisher Library has an extensive and detailed set of finding aids for Margaret Atwood’s literary manuscripts, typescripts, and other archival materials. At present (April, 2022) some of Atwood’s annotated typescripts from her writing of The Handmaid’s Tale are on display in the Fisher’s exhibition space along with other items of interest from the collections.
  3. On Marshall McLuhan’s annotations in his copies of James Joyce’s books (especially Ulysses), see my blog post and two articles: “Reading McLuhan Reading Ulysses,” in “Many McLuhans or None at All,” ed. Sarah Sharma, special issue, Canadian Journal of Communication 44, no. 4 (2019): 503–26; and “Imagining Marshall McLuhan as a Digital Reader: an Experiment in Applied Joyce,” in “Reading McLuhan Reading,” ed. Paula McDowell, special issue, Textual Practice 35, no. 9 (2021): 1525–49. For more on McLuhan’s library, see his grandson Andrew McLuhan’s blog, Inscriptorium, and his essay in the Textual Practice special issue. The Fisher Library has detailed finding aids for McLuhan’s library and archival materials.
  4. Near the end of the talk I mentioned two initiatives worth supporting. One is the Children’s Book Bank, which makes books and literacy support available to children and families in high-needs areas in the Toronto area. The other is the Saving Ukrainian Cultural Heritage Online (SUCHO) project, which is working to preserve websites and other forms of digital cultural heritage in Ukraine that have been put at risk by the Russian invasion.
  5. The image on the final slide is from the Wikipedia page on the Birnam Oak, believed to be one of the last remaining trees from Birnam Wood, which plays a key role in Shakespeare’s Macbeth.

Pleasures of Contamination: David C. Greetham’s Influence on Textual Scholarship, Past and Future @ MLA 2021

David Greetham’s work is a big influence on my Veil of Code project, and he was someone who inspired me to become interested in textual scholarship many years ago. I was saddened to learn of his passing in spring 2020, just as the COVID-19 pandemic was reaching North America. As a tribute to David, and to draw attention to the continuing value of his work, Kathy Harris and I organized the roundtable described below for the 2021 Modern Language Association conference. You can also find this proposal posted on the blog of the MLA’s Committee on Scholarly Editions, who co-sponsored this panel along with the Society for Textual Scholarship. I am very grateful to these co-sponsors, and especially to Kathy and our other panellists, Paul Eggert, Amanda Licastro, Sarah Lubelski, and Jerome McGann, for enabling us to pull this roundtable together on very short notice, and during a time of great disruption and uncertainty.

Illustration of the roundtable by Jojo Karlin (jojokarlin.com @jojokarlin), used with permission.

__________________

For this MLA Convention 2021, the MLA Committee on Scholarly Editions and the Society for Textual Scholarship are co-sponsoring a roundtable discussion that considers the work of the textual scholar David Greetham and his consistent focus on the textuality of all kinds of cultural works—their entanglement of meaning, intention, and materiality—as the unifying idea in a body of work whose diversity runs counter to textual scholarship’s tendency to specialize by period, national tradition, or medium. With a turn toward the future of the humanities, the session will focus on discussion, aside from brief opening statements.

Session Description:

David C. Greetham changed the way we understand textual scholarship in the English-speaking world. As the author of provocative and erudite books and articles, as one of the founders of the Society for Textual Scholarship, as a former member of the MLA Committee on Scholarly Editions, and as a generous teacher and mentor to his students and peers alike, he set the stage for textual studies’ entry into the twenty-first century. With his passing in March 2020, textual studies and its related fields have occasion not only to remember his contributions, but also to take up Greetham’s many intellectual provocations to think radically and adventurously about the role of textual studies within the humanities, broadly speaking. As he argues in his final book, The Pleasures of Contamination: Evidence, Text, and Voice in Textual Studies, the notion of textual critics and editors as guardians of textual purity is less productive than its alternative; instead, he suggests, “contamination may be seen as normative, healthy, and necessary: a textual (and human) condition to be celebrated rather than condemned” (3-4). In his sometimes irreverent but always deeply learned studies of the production, transmission, and reception of texts, Greetham gave us forms of textual scholarship for a messy, imperfect world. 

This roundtable looks back on Greetham’s ideas, but also forward to their evolving relevance. All of the roundtable participants share Greetham’s enthusiasm for the idea that the concept of text is the locus of all discussions, scholarship, and pedagogy, whether literature, architecture, tweets, paintings, music, or anything of potential cultural value in the twenty-first century. Greetham’s consistent focus on the textuality of all kinds of cultural works—their entanglement of meaning, intention, and materiality—is the unifying idea in a body of work whose diversity runs counter to textual scholarship’s tendency to specialize by period, national tradition, or medium. His fearless curiosity and willingness to cross disciplinary boundaries are qualities that the humanities will need in the twenty-first century. 

The intended audience for this roundtable will obviously include those who knew Greetham’s work and felt its influence, though we are equally committed to reaching audiences who are new to textual scholarship—especially early-career scholars and those who do not self-identify with textual studies as a field.

Consider the following questions for discussion during our roundtable (we invite a robust conversation with the audience and among panelists): 

  • Greetham always considered “teaching and lecturing as branches of the entertainment industry . . . and generally performed with energy and a good dramatic sense of the occasion” (Textual Transgressions 44). With this goal, he managed to engage his graduate students in robust intellectual conversation that allowed for their individual explorations. How can we use his capacious strategy in our own teaching and mentoring?
  • Given that many graduate programs are struggling to balance breadth and specialization, how was Greetham able to embrace the breadth of textual studies and open it out to other areas like digital humanities and media studies? How has the kind of outreach he practiced been received and reciprocated in these and other fields?
  • In Greetham’s Theories of the Text, The Margins of the Text, and Textual Transgressions, he took great pleasure in this idea of “contamination” or (as Derrida says) embracing “archive fever.” How does Greetham’s idea of contamination speak to the evolving nature of multi-modal or multimedia texts?
  • Twenty years ago in Theories of the Text, Greetham explored different branches of critical theory to establish the broad relevance of the concept of text. In similar fashion, throughout his work he treats a remarkably broad range of cultural artifacts and media types as texts, from arias to remixed cartoons to word processor files. In Greetham’s work, the concept of text goes hand-in-hand with a radical commitment to breadth. What, then, are the advantages and limits of the term text? Does the word resonate differently now? 

The presider and roundtable participants represent a range of Greetham’s peers (Eggert, McGann), former students (Harris, Licastro), and other scholars who engage with his ideas in their own work (Lubelski, Galey). The participants represent a range of career stages and critical approaches, as well. Aside from brief three-minute opening statements, there will be no position papers or formal presentations, and the focus will be on discussion and audience participation, guided by the questions above.

Organizer & Presider: Alan Galey (alan.galey@utoronto.ca) U of Toronto  

Expertise and Scholarship: Alan Galey has published widely in textual studies and related fields, and his research has been recognized by awards from the Society for Textual Scholarship and the Association of College and Research Libraries. He is director of the graduate program in Book History & Print Culture at the University of Toronto. His current book project, The Veil of Code: Studies in Born-Digital Bibliography, draws on the spirit and substance of David Greetham’s work by extending the theories, methods, and mindsets of textual scholarship to the study of digital materials, from ebooks to digital music to videogames (www.veilofcode.ca).

Organizer & Speaker: Katherine D. Harris (katherine.harris@sjsu.edu) San Jose SU 

Expertise and Scholarship: Katherine D. Harris, Professor of Literature & Digital Humanities, has published widely in textual studies, history of the book, bibliography, digital humanities, and digital pedagogy. With David C. Greetham as her dissertation advisor and mentor, she went on to publish Forget Me Not: The Rise of the British Literary Annual 1823–1835, The Forgotten Gothic: Short Stories from British Annuals 1823–1831, and Digital Pedagogy in the Humanities. Harris’ mixing of textual studies and literary criticism in both her scholarship and pedagogy reflects Greetham’s capacious style of mentoring and his prescient anticipation of the digital humanities field.

Speaker: Paul Eggert (pauleggert7@gmail.com) Loyola U Chicago (appearing remotely)

Expertise and Scholarship: Paul Eggert held the Svaglic Chair in textual studies at Loyola University Chicago, where he is now Professor Emeritus. From the late 1980s he found David Greetham’s shameless boundary hopping an inspiring example. A lot of editing and general editing intervened before Eggert’s theories of the editorial act – in some ways a reply to Greetham’s centralising of the text concept – were expressed in The Work and the Reader in Literary Studies: Scholarly Editing and Book History (2019). This followed Biography of a Book (2013), and the award-winning monograph Securing the Past: Conservation in Art, Architecture and Literature (2009). 

Speaker: Amanda Licastro (amanda.licastro@gmail.com) Stevenson U 

Expertise and Scholarship: Amanda Licastro is Assistant Professor of Digital Rhetoric at Stevenson University, and has published on writing studies, digital media, and pedagogy. Reading about David Greetham’s course in Jerome McGann’s essay was one pivotal reason Amanda applied to do her doctoral studies at CUNY’s Graduate Center. Upon admission, Amanda took two courses with David before he became a member of her orals and dissertation committees. Her dissertation, which won the Calder Prize in Digital Humanities, and the subsequent publications “The Problem of Multimodality,” and “The Past, Present, and Future of Social Annotation,” are heavily influenced by David’s guidance and scholarship. 

Speaker: Sarah Lubelski (sarah.lubelski@mail.utoronto.ca) Ryerson University 

Expertise and Scholarship: Sarah Lubelski holds a Postdoctoral Fellowship from the Social Sciences and Humanities Research Council of Canada at the English Department at Ryerson University. Her PhD thesis, defended in 2019, is titled A Gentlewoman’s Profession: The Emergence of Feminized Publishing at Richard Bentley and Son, 1858-1898, and was the recipient of the iSchools Doctoral Dissertation Award. Her postdoctoral research on gender and publishing intersects with David Greetham’s writing on feminism and textual studies, and on questions of power and identity in editing and publishing work. As an early-career interdisciplinary scholar, she brings a vital perspective on the future of textual scholarship. 

Speaker: Jerome J. McGann (jjm2f@virginia.edu) U of Virginia (appearing remotely)

Expertise and Scholarship: Jerome McGann is an award-winning literary and textual scholar, and one of the most influential figures in these fields. His most recent books are The Poet Edgar Allan Poe: Alien Angel and A New Republic of Letters: Memory and Scholarship in the Age of Digital Reproduction. His work has been in dialogue with David Greetham’s since the early days of the Society for Textual Scholarship. Like Greetham’s, his work established an early and influential link between textual scholarship and what is now called digital humanities, and has consistently focused on the importance of philological thinking to the humanities generally.

Imagining Marshall McLuhan as a Digital Reader

Alan Galey, “Reading McLuhan Reading Ulysses,” in “Many McLuhans or None at All,” ed. Sarah Sharma, special issue, Canadian Journal of Communication 44, no. 4 (2019): 503–26 [open-access version: http://hdl.handle.net/1807/99516]

Alan Galey, “Imagining Marshall McLuhan as a Digital Reader: an Experiment in Applied Joyce,” in “Reading McLuhan Reading,” ed. Paula McDowell, special issue, Textual Practice 35, no. 9 (2021): 1525–49 [open-access version: https://hdl.handle.net/1807/108256]

On March 5–6 I was part of the Reading McLuhan Reading symposium at NYU, a follow-up to the Many McLuhans symposium held at the University of Toronto’s Thomas Fisher Rare Book Library in September 2018. Organized by NYU’s Paula McDowell, the New York symposium was a fascinating intersection between book historians, literary scholars, intellectual historians, and others working in the space that McLuhan helped to create for those who care about literature and media alike.

RMR Poster.jpg

I’m not a McLuhanite by any stretch, and like many book historians I struggle with his elliptical prose (at least in his books), his loose treatment of historical evidence, and his too-easy generalizations. Today, reading a book like The Gutenberg Galaxy as a factual guide to the history of print would be like a medical student working from a textbook written prior to the discovery of viruses or DNA. What, then, can a modern-day book historian do with McLuhan? For years I mostly bypassed his work and introduced my students to more thorough and reliable (and no less exciting) scholarship like Adrian Johns’s The Nature of the Book: Print and Knowledge in the Making. When I did bring McLuhan’s work into courses like Introduction to Culture & Technology, I would accompany it with a stern warning to my students: “don’t write like this!” As John Durham Peters suggests, “one reads McLuhan for sparks, not scholarship” (The Marvelous Clouds: Toward a Philosophy of Elemental Media, p. 17). Similarly, Andrew McLuhan’s talk at the NYU symposium emphasized other reasons for reading his grandfather’s work today, particularly the idea that it helps us learn to read a medium the way we would read a poem.

As I’ve been discovering, a good place for a book historian to start with McLuhan is his own books—in the literal sense of his personal library, which is now housed at the Fisher Library in Toronto. The McLuhan of Gutenberg Galaxy drives me mildly nuts, but the McLuhan who read so broadly and deeply, and who annotated his own books so carefully and systematically, is a reader I now approach with real fascination.

McLuhan’s library waits at the Fisher for researchers to explore it, and anyone can start with the online finding aid, and with the blog Inscriptorium, created by Andrew McLuhan, who catalogued his grandfather’s library. His blog is rich with examples and images of McLuhan’s annotation practices, and gives a good sense of the working practices of a scholar who used his books to create a textual workshop.

In 2018 I was asked to speak at the Many McLuhans symposium because I study reader marginalia and regularly give my book history students an “Annotating Reader Profile” assignment. That talk led to a contribution to a special issue of the Canadian Journal of Communication edited by my colleague Sarah Sharma, appropriately titled “Many McLuhans or None at All.” My article, “Reading McLuhan Reading Ulysses,” is a case study in McLuhan’s annotations in his copies of the James Joyce novel. Why not take his favorite Joyce novel, Finnegans Wake, as my case study? My article offers a few reasons for looking at Ulysses instead, but the key reason is that I’m hoping a more serious Joyce/McLuhan scholar will take on that substantial task. You can find an open-access version of this article and its sequel via links in the citations at the top of this post.

What, then, was McLuhan like as a reader? While his own published books are often maddeningly light on notes and sources (again, contrast Johns’s The Nature of the Book), his library reveals that he was a careful and systematic reader, who annotated many of his books so that they’d function like tools in a workshop. In the article I look in detail at two of the four editions of James Joyce’s Ulysses included in the McLuhan collection at the Fisher. This image shows McLuhan’s heavily annotated copy of the two-volume Odyssey Press edition, which McLuhan owned during his student days at Cambridge, where he studied with F.R. Leavis and other great literary scholars of the day. (Odyssey Press was an imprint of Albatross Press, a remarkable German publisher who put out edgy modern literature under the nose of the Nazi regime; see Michele K. Troy’s excellent recent study, Strange Bird: The Albatross Press and the Third Reich.)

McLuhan Odyssey Press Ulysses

McLuhan’s student edition of James Joyce’s Ulysses (Odyssey Press, 1933), held at the University of Toronto’s Fisher Rare Book Library

McLuhan annotated his Odyssey Press Ulysses volumes copiously. He marked up interesting passages, such as the notoriously difficult opening to the episode Proteus (“Ineluctable modality of the visible…”). On the blank flyleaves, he copied long extracts from critical works by T.S. Eliot, Wyndham Lewis, and others. McLuhan’s copies even have homemade dustjackets covered in annotations, visible in the image above. (These might have served a second purpose: concealing the fact that he was reading a novel still banned in England when he was at Cambridge. His own literature professor F.R. Leavis ran into trouble with the authorities when he attempted to assign Ulysses to his students in the 1920s.) In many other books, including his copy of the 1961 Random House edition of Ulysses, McLuhan also used the back flyleaves and endpapers for his own topical indexes, which provide a picture of what interested him in the book and how he traced those interests to specific points in the text. See the article for images and more detailed examples.

In my follow-up presentation at NYU last week, I considered a question that’s been on my mind since completing the article: what if McLuhan had been doing his reading on the digital platforms we have today? How might his annotation practices have been different (or not)? And what would we be able to learn from digitally annotated books as sources of evidence? This last question in particular brings me back to the themes I’m exploring in my Veil of Code project.

To explore both questions — the first being about practices, the second about evidence — I experimented with different digital reading platforms, and attempted to mark up the opening of the Proteus episode in Ulysses the way McLuhan did in his 1961 Random House edition. For example, this image from my slides shows my inexact reproduction of McLuhan’s notes, using a Project Gutenberg PDF version of Ulysses and annotated using the default macOS Preview app:

Proteus slide.jpg

As you can see, it’s possible in Preview to add notes that aren’t anchored to a specific word or phrase in the text. There’s no such flexibility in the Kindle or Apple Books desktop apps. My guess is that McLuhan and any other skilled annotator would have found themselves fighting these interfaces. Sometimes a general, even vague thematic suggestion like “Lear” written next to a paragraph on the theme of vision can help spark ideas upon re-reading, and the spontaneous scribble remains an important part of being an annotating reader—at least for those using more permissive platforms like Preview.

However, for the purposes of my Veil of Code project and this blog, I’m more interested in what we can learn from the digital files that are generated by digital annotations. For example, one of the persistent and often unanswerable questions I had when reading McLuhan’s marginalia was when did he write this? His 1961 Random House Ulysses contains layers of marginalia which may have been written days or years apart, perhaps as part of focused re-reading with a project in mind, or simply in the course of revisiting Ulysses after a long absence. Marks on paper can tell us a lot, but they can’t answer these questions.

Digital marginalia can become a goldmine for answering questions like these. For example, if you annotate in Apple Books (formerly known as iBooks), you can locate a SQLite database file containing those annotations. It’s buried fairly deep in the system files, but it’s not inaccessible. On my MacBook Pro (running macOS Catalina v10.15.3) the file path is: Macintosh HD/Users/[username]/Library/Containers/com.apple.iBooksX/Data/Documents/AEAnnotation/AEAnnotation_v10312011_1727_local.sqlite. Although I made my annotations on a DRM-protected ebook (the Penguin Modern Classics edition of Ulysses), the separate annotations database file isn’t encrypted, and can be viewed using open-source software like DB Browser for SQLite. Here’s a simplified view:

Apple Books SQL db with annotations.png

Each row corresponds to an individual annotation, and each column (left to right) shows date created, date modified, note content, and the annotated text in the ebook. The database records other information about each note, but even these four data points would answer the when? question with precision, down to the second. (The timestamps look unreadable here only because they’re recorded in the Apple Cocoa Core Data format, which counts the number of seconds that have passed since midnight GMT on January 1, 2001; see this helpful site for an explanation and easy-to-use converter).
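For readers who want to try this themselves, here is a minimal sketch in Python of the same process: opening the annotations database and converting its Core Data timestamps into readable dates. I should stress that the table and column names below follow Core Data’s Z-prefix naming convention as they appeared in my copy of the file; treat them as assumptions to verify against your own database (in DB Browser for SQLite, for instance), since Apple may change them between macOS versions.

    import sqlite3
    from datetime import datetime, timedelta, timezone

    # Apple's Core Data timestamps count seconds elapsed since
    # midnight GMT on January 1, 2001.
    COCOA_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)

    def cocoa_to_datetime(seconds):
        """Convert a Core Data timestamp to a readable UTC datetime."""
        if seconds is None:
            return None
        return COCOA_EPOCH + timedelta(seconds=seconds)

    # Adjust [username] (and possibly the file name) for your own system.
    DB_PATH = ("/Users/[username]/Library/Containers/com.apple.iBooksX/Data/"
               "Documents/AEAnnotation/AEAnnotation_v10312011_1727_local.sqlite")

    conn = sqlite3.connect(DB_PATH)
    # ZAEANNOTATION and its Z-prefixed columns are assumptions based on
    # one copy of the database; confirm them with a schema browser first.
    rows = conn.execute(
        "SELECT ZANNOTATIONCREATIONDATE, ZANNOTATIONMODIFICATIONDATE, "
        "ZANNOTATIONNOTE, ZANNOTATIONSELECTEDTEXT "
        "FROM ZAEANNOTATION WHERE ZANNOTATIONNOTE IS NOT NULL")
    for created, modified, note, selected in rows:
        print(cocoa_to_datetime(created), cocoa_to_datetime(modified),
              note, selected)
    conn.close()

If those schema assumptions hold on your machine, the output corresponds to the rows visible in the screenshot above, with the timestamps converted to human-readable dates.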

If you’re wondering why some of the notes repeat in the screenshot above, that’s a question that leads to the most interesting thing I discovered in this experiment. Not only does the Apple Books SQLite database record the precise moments of creation and modification for each annotation, it also retains a record of deleted notes. The repeated notes that read “touch” and “sign = miracle” in the screenshot above were left behind inadvertently as I created and deleted notes while getting my screenshots ready for my presentation slides. (You can even use the timestamp converter linked above to figure out exactly when I was doing it.) The final note (row 62) is one I added and then deleted from my Ulysses ebook in Apple Books, yet here it is preserved in the database.

I still need to profile how and when exactly deleted notes leave a trace in the Apple Books SQLite file. In any case it would be a remarkably rich source of evidence for a future historian who had access to an archived disk image of an annotating digital reader’s hard drive. (For real-world examples of this kind of research scenario, see Matt Kirschenbaum’s book Track Changes: a Literary History of Word Processing.) A database file like this would be immensely valuable to someone studying the reading habits of a particular reader. One wouldn’t even need to invest effort to create a query-friendly database of the annotations; the SQLite file already is a database, and can be searched and queried using software like DB Browser for SQLite, shown above.

Then again, even if we can know with absolute precision when a reader created or modified an annotation, or even read their deleted annotations, have we really learned what matters most about their acts of reading? With a field like digital forensics, the volume of precise but narrowly focused evidence it can recover can lead us to overemphasize what’s available, and to neglect what we can’t see. McLuhan’s marginalia were part of a larger system of reading, and they point to ephemeral aspects of reading that can’t be materialized, yet were nonetheless real. For this reason, reading McLuhan’s annotated books at times feels like a window into his scholarly workshop, and at other times feels more like a mirror that reflects one’s own forensic impulses back onto the researcher, forcing a reckoning with the limits of evidence and the elusive nature of reading.

Looking back at the timestamps for my fake McLuhan annotations in the screenshot above, I’m reminded of the afternoon when I was experimenting with the Apple Books annotations, in preparation for the Reading McLuhan Reading symposium. Taking a break, I walked my dog, Boomer, around Toronto’s Wychwood Park neighbourhood, not far from my home, where McLuhan happened to have lived while he taught at the University of Toronto. Walking past the old McLuhan house, I thought of the strange and sometimes unhealthy obsession that some of the hardcore McLuhanites seem to have with his legacy, and wondered if any still make pilgrimages to the house looking for some kind of special insight. As we were walking, I also thought back to two readings I’d been revisiting for the NYU presentation, John Durham Peters’s book The Marvelous Clouds (mentioned above), which considers nature as media, and Jody Berland’s article in the “Many McLuhans or None at All” special issue, titled “McLuhan and Posthumanism: Extending the Techno-Animal Embrace,” which draws on the field of animal studies to reconsider McLuhan’s ideas about the senses. (By coincidence, Boomer sometimes plays with Jody’s dog at the nearby dog park; Toronto is a small world sometimes.)

All this prompted a reflection on media and what one might call the ineluctable modality of the recoverable. McLuhan, of course, is best known for drawing attention to the ways media reconfigure relationships between the senses and knowledge. While I had spent the afternoon looking for answers about McLuhan’s reading in manuscript and digital annotations, Boomer was busy picking up myriad scents and signals from the days-old snowbanks in McLuhan’s old neighbourhood. As dog owners know, a simple walk in the park can be a powerful lesson in how our most important companion animals (sorry, cats) experience the same space but a different world thanks to their differently balanced senses—especially the sense of smell, which has strong links to memory and the past. On that quiet winter afternoon, who could say which of us was more in tune with their own sensorium, and which was the better reader of the faint traces of those who went before?

My money’s on the dog… look at that nose.

01428EEB-5967-46F2-87B6-885E55200CFA_1_105_c.jpeg

Boomer, an olfactory forensics expert, with the McLuhan house in Toronto’s Wychwood Park in the background.

Analyzing Ebooks in the Age of Digital Locks: Challenges and Strategies

The following post was written for the forthcoming final report of the Books.Files project, led by Matthew Kirschenbaum and funded by the Andrew W. Mellon Foundation. The project’s final report is publicly available here, and you can also read Matt’s description of the Books.Files project’s rationale in Archive Journal. I presented an early version of this material at the 2019 conference of the Society for the History of Authorship, Reading, and Publishing (SHARP). For helpful feedback and conversation on these questions, I am grateful to the audience at SHARP, Matt Kirschenbaum, Victoria Owen, and Simon Stern.

What can an ebook reveal about the history and social contexts of its making, or the collaborative nature of its construction? This is the kind of question that bibliographers have been asking—and answering—with regard to printed books for many years, and it is a viable question for ebooks as well. However, ebooks are made of code organized into files, and it is nearly impossible to answer a question like this if those files are not accessible, along with digital publishers’ records generally. Scholars in fields ranging from analytical bibliography to book history to video game studies have emphasized the importance of first-hand analysis of digital objects at the level of code, and not just what we see on the screen. If we wish to understand the relationships between an ebook’s form and functionality, or if we need to account for an apparent error in an ebook’s construction, or if we are curious about plans for an ebook’s design that may have been abandoned but left vestigial traces in the code, we will need to look for evidence that can only be found within files that are increasingly walled off behind digital locks. The idea that the code, and not just the visible interfaces of ebooks and other digital objects, can yield insights about their natures is foundational to fields that care about materiality, and an important avenue of potential discovery. In this post, I’ll consider the challenges facing the code-level study of ebooks in a world of Digital Rights Management (DRM) systems, in which they are increasingly published with digital locks (known formally as Technical Protection Measures, or TPM) that impede direct access to them as primary evidence.

Ironically, digital locks themselves may be easily broken with tools that are not difficult to find on the web; the greater challenge, which is my focus here, is that the act of breaking TPM—or sharing the tools to do so—may fall within a grey area of copyright policy and law. Dan Burk, who works at the intersection of copyright law and digital materiality, articulates the crux of the problem: “Lacking the deliberative nuance of human agency, DRM lacks the flexibility to accommodate access or usage that is unforeseen, unexpected, or unanticipated.”[1] For the most part, DRM and digital locks are deployed under a narrowly positivist paradigm that assumes all possible uses of a text are knowable and codifiable in advance. Those who study books and reading, in all their forms, know that’s not true and never has been. Breaking digital locks for unauthorized resale or distribution on torrent sites is not something I would defend—it’s happened to my own books—but can there be a legitimate rationale for breaking digital locks in the service of scholarly research on digital artifacts?

In the United States, the Digital Millennium Copyright Act (DMCA) broadly prohibits the circumvention of digital locks on copyrighted materials, regardless of intention, and prohibits trafficking in technologies that facilitate circumvention. Section 1201 of the DMCA provides for exceptions to the DMCA’s anti-circumvention prohibitions, and those rules are revised every three years in conjunction with the Librarian of Congress. The European Union’s Information Society Directive contains similar prohibitions, and it also has a mechanism for member states to establish valid exceptions (e.g. breaking digital locks on an ebook to enable screen-reading software to work, often necessary for readers with visual disabilities). However, in the EU and the United States, there has been widespread concern that even with these mechanisms, DRM nonetheless inhibits uses of digital objects that should be—and in many cases, are—legal and protected by the doctrines of fair use and fair dealing.

As a digital bibliographer based in Canada, I do my work in a country where these issues are far from settled. From 2017 to 2019, the Canadian Copyright Act underwent a statutory review, and a Parliamentary committee travelled the country to receive feedback from stakeholders. My own 2012 study of an ebook’s source code was one of many examples presented to the committee to support the idea that there are valid reasons for TPM circumvention. Remarkably, the committee’s final report (released in June 2019) recommends a balanced approach to TPM circumvention, including a non-exhaustive (“such as…”) approach to enumerating reasonable exceptions to copyright. Even fierce defenders of the public domain such as Michael Geist received the report optimistically, but whether its recommendations will become Canadian law is another question—and, even so, that may be cold comfort outside the borders of my home country.

So where does this leave someone who wants to sit down and dig into the code of an ebook right now, to see what they can learn? A university-based digital bibliographer wishing to examine the source code of an ebook may know precisely how to access its code, but may be more uncertain as to whether she can do so without violating copyright law, policies of universities or funding agencies, or the terms of End User License Agreements. The stakes are even higher for those in positions of precarity, and the chilling effects of uncertainty about DRM circumvention for scholarly purposes are very real. I’ll conclude with an outline of five possible responses to this scenario that I’ve identified (and named in the spirit of tvtropes.org), though none of them may be adequate on their own. I also hasten to add that these are descriptions of practices, not recommendations or legal advice (which I’m not qualified to offer). Scholars contemplating these kinds of strategies should always seek advice from someone qualified and authorized to provide it, such as a university copyright librarian.

  1. The “what happens in Vegas…” approach: breaking digital locks in the course of one’s research, but omitting any discussion of how one broke them, or any acknowledgement that one broke them at all. This has the advantage of being a genuine path to knowledge about the artifact under study, and it provides the researcher with evidence that can answer many bibliographical questions. The downside is that one can’t be fully transparent about one’s methods, one can’t do this with students (or with peer workshops like those at the SHARP conference or Rare Book School), and one might still be breaking the law. 
  2. The “Thor Heyerdahl” approach: instead of breaking digital locks on the object under study, building a replica as a kind of manipulable model, and hoping it behaves analogously to one’s real object of study (see the code sketch after this list). Working in the spirit of experimental archaeology, it is much easier to create an ebook using an open standard like EPUB than it was for Thor Heyerdahl to build and sail his experimental ship, the Kon-Tiki, and one can test hypotheses about ebooks in safer environments than the waters of the South Pacific. This approach can work quite well in an educational context, but only with relatively simple digital objects using open standards like EPUB, and conclusions based upon it must rely on probability and conjecture rather than empirical evidence. 
  3. The “Spotify Teardown” approach: modelling the algorithms that govern a digital system by manipulating its inputs and examining the results. This strategy takes its name from the recent book Spotify Teardown: Inside the Black Box of Streaming Music, written by a group of researchers who wanted to understand the algorithms that govern Spotify’s behavior as a music distribution platform. Their multi-pronged set of methods included creating their own music label for research purposes, and using it to upload files that tested Spotify’s behaviors in various ways. This approach can work for those interested not just in a single digital thing, like an ebook, but in systems that circulate many digital things. Disadvantages include those for the “Thor Heyerdahl” approach mentioned above, and the possibility of legal pressure from the company under study (which the Spotify Teardown authors—and their funding agency in Sweden—successfully resisted). 
  4. The “grateful lurker” approach: documenting how online communities who care about certain kinds of digital artifacts are discussing them, curating them, and sometimes breaking them open to understand how they work—and how they share their evidence online. This strategy works especially well for video games, many of which have thriving online communities of modders: people whose work to repurpose video game engines to create new games often leads to discoveries about the original game’s development process. The evidence for those discoveries often comes in the form of abandoned design features or digital assets that the developers neglected to remove from the source code—both forms of evidence that usually require digital lock-breaking to access first-hand. I have also adapted this approach to the study of digitally curated musical recordings, though ebooks may not benefit from the same levels of dedicated online communities. The main advantage, of course, is that someone else is doing the digital lock-breaking—and they may document their methods and analysis to a reasonably high standard of evidence, sometimes even with informal community peer-review. However, second-hand lock-breaking also means second-hand evidence, which may not meet the empirical standards of scholarly researchers. Nevertheless, the “grateful lurker” approach does harmonize nicely with book history’s emphasis on reception and what D.F. McKenzie called the “sociology of texts,” and can shine a light on the valuable cultural heritage work done by online pro-am communities (i.e. amateurs whose work approaches or reaches a professional standard). Yet not all online communities may want that kind of light shone on them; following research ethics protocols for studying online communities is therefore essential. 
  5. The “Tom Petty” approach (cf. his lyrics to “I Won’t Back Down”): breaking digital locks openly and unapologetically, on the understanding that one is acting reasonably within the limitations to copyright, with no intention of infringement or piracy, and then standing one’s ground. Disadvantages are obvious, as are advantages. Less obvious, but no less real, are the networks of support and advocacy for those whose scholarship sometimes requires the protection of the law.[2] In ideal circumstances, this is not so much a challenge to copyright law as an opportunity to clarify its purpose and limits. Not an approach to try alone, but then again neither is most digital scholarship.
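To give a sense of how low the barrier to this kind of experimental archaeology can be, here is a minimal sketch of the “Thor Heyerdahl” approach using Python’s standard library. This is my own illustration, not code from any of the projects mentioned above; the filenames, metadata, and placeholder text are invented for the example, and a real research replica would be modelled on the features of the ebook under study.

```python
import zipfile

# Files for a minimal EPUB 3 replica. All titles, filenames, and text
# below are placeholders invented for this sketch.

CONTAINER_XML = """<?xml version="1.0" encoding="UTF-8"?>
<container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>
"""

CONTENT_OPF = """<?xml version="1.0" encoding="UTF-8"?>
<package xmlns="http://www.idpf.org/2007/opf" version="3.0" unique-identifier="uid">
  <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
    <dc:identifier id="uid">urn:uuid:00000000-0000-0000-0000-000000000000</dc:identifier>
    <dc:title>Kon-Tiki: A Replica Ebook</dc:title>
    <dc:language>en</dc:language>
    <meta property="dcterms:modified">2016-01-01T00:00:00Z</meta>
  </metadata>
  <manifest>
    <item id="nav" href="nav.xhtml" media-type="application/xhtml+xml" properties="nav"/>
    <item id="ch1" href="chapter1.xhtml" media-type="application/xhtml+xml"/>
  </manifest>
  <spine>
    <itemref idref="ch1"/>
  </spine>
</package>
"""

NAV_XHTML = """<?xml version="1.0" encoding="UTF-8"?>
<html xmlns="http://www.w3.org/1999/xhtml" xmlns:epub="http://www.idpf.org/2007/ops">
  <head><title>Contents</title></head>
  <body>
    <nav epub:type="toc"><ol><li><a href="chapter1.xhtml">Chapter 1</a></li></ol></nav>
  </body>
</html>
"""

CHAPTER_XHTML = """<?xml version="1.0" encoding="UTF-8"?>
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Chapter 1</title></head>
  <body><p>Test text for the replica: alter me and see what happens.</p></body>
</html>
"""

with zipfile.ZipFile("replica.epub", "w") as epub:
    # The EPUB spec requires that the first entry in the ZIP archive be a
    # file named "mimetype", stored without compression.
    epub.writestr("mimetype", "application/epub+zip",
                  compress_type=zipfile.ZIP_STORED)
    # container.xml tells reading systems where to find the package document.
    epub.writestr("META-INF/container.xml", CONTAINER_XML,
                  compress_type=zipfile.ZIP_DEFLATED)
    # The package document declares metadata, a manifest, and a reading order.
    epub.writestr("OEBPS/content.opf", CONTENT_OPF,
                  compress_type=zipfile.ZIP_DEFLATED)
    epub.writestr("OEBPS/nav.xhtml", NAV_XHTML,
                  compress_type=zipfile.ZIP_DEFLATED)
    epub.writestr("OEBPS/chapter1.xhtml", CHAPTER_XHTML,
                  compress_type=zipfile.ZIP_DEFLATED)
```

Because every byte of such a replica is the researcher’s own, one can alter the markup, the metadata, or even the ZIP structure itself, open the result in various reading systems, and observe what happens, all without breaking anyone’s locks. A validator such as epubcheck can then confirm whether the altered replica still conforms to the standard.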

[1] Dan L. Burk, “Materiality and Textuality in Digital Rights Management,” Computers and Composition 27 (2010): 231.

[2] A good place to start is Patricia Aufderheide and Peter Jaszi’s book, Reclaiming Fair Use: How to Put Balance Back in Copyright, 2nd ed. (University of Chicago Press, 2018), especially their chapter “The Culture of Fear and Doubt, and How to Leave It,” 1–16.

Archivaria article on the Tragically Hip

[Update: as of November 2021, I’ve revised some parts of this post in light of TheHip.com’s removal of lyrics and streaming audio, HipBase.com’s removal of its live music archives, and—on a happier note—the official release of some previously rare recordings on the EP Saskadelphia and the Road Apples 30th Anniversary Edition Box Set. In some cases I’ve removed links to unofficial versions of songs where official releases now exist.]

“In this gradual transformation of the archivist from passive keeper guarding the past to active mediator self-consciously shaping society’s collective memory, the archive(s) itself is changed from an unquestioned storehouse of history waiting to be found to itself becoming a contested site for identity and memory formation.”

— Terry Cook, “The Archive(s) Is a Foreign Country,” Canadian Historical Review 90, no. 3 (2009), p. 533

“Our conversation is as faint a sound in my memory
As those fingernails scratching on my hull.”

— The Tragically Hip, “Nautical Disaster,” Day for Night (1994)

Terry Cook’s comments on the changing role of archives invite us to reconsider who is doing the work of archiving in the present, especially in contexts where traditional archival institutions haven’t always ventured. In an article just published in Archivaria, titled “Looking for a Place to Happen: Collective Memory, Digital Music Archiving, and The Tragically Hip,” I look at the online archival practices of fans of Canada’s most popular rock band. A public version of the article is available via the University of Toronto’s institutional repository (with thanks to Archivaria for their open-access and self-archiving policy):

Alan Galey, “Looking for a Place to Happen: Collective Memory, Digital Music Archiving, and The Tragically Hip,” Archivaria 86 (2018): 6-43. [https://tspace.library.utoronto.ca/handle/1807/92528]

Although Canada is home to some excellent music archives — including the Media Commons Archives at my home institution — the field is still challenged by questions about how best to preserve something as ordinary (and extraordinary) as a concert tour. This post describes some of the themes that I develop in more detail in the article, and also provides some links to images and audio recordings that I wasn’t able to include in the published article. (If you’re here for the audio links, just skip down to the end.)

We can learn a lot from The Tragically Hip as a case-study in digital archiving by online communities for a few reasons. For one, The Hip had a longstanding policy of allowing audience taping of their concerts for non-commercial purposes. As with The Grateful Dead and other taping-friendly bands, a large corpus of fan-curated performances accumulated over The Hip’s 30+ years of touring. Much of this material has made its way online, especially thanks to the website HipBase.com, which functions as a setlist archive and digital repository for audience recordings (and a few FM radio broadcasts). [Update: as of 2021 HipBase’s repository of live recordings is no longer available, which is a great loss for fans and researchers alike, but the setlist database is still available.] As longtime Hip fans know, these recordings can shed light on the band’s songwriting process thanks to lead singer Gord Downie’s tendency to experiment with lyrics in performance, ranging from dropped-in phrases to whole stories told mid-song — any of which might show up in songs on the next album. For this reason, not to mention Gord’s and the band’s typically energetic delivery, no two Hip shows are exactly alike, and their live performances offer a glimpse into their songwriting workshop.

[Image: the Barilko banner at the Air Canada Centre]

The Tragically Hip performing at Toronto’s Air Canada Centre on the 2016 Man Machine Poem Tour. Uncredited image from a Toronto Maple Leafs Twitter post on 18 October 2016.

In addition to fans’ online curation of Hip recordings as history, another reason to consider The Hip from an archival perspective is that their music is so often about Canadian history. Some have blamed this for their relative lack of popularity outside of Canada, but it undoubtedly contributes to their massive popularity within their home country.

For example, in the foreground of the image above hangs the retired #5 banner for legendary Maple Leafs defenseman Bill Barilko, whose mysterious disappearance is the subject of the song “Fifty Mission Cap” (with lyrics adapted from a hockey card). Whenever the band performed the song at the Air Canada Centre, and especially in its predecessor, Maple Leaf Gardens (where Barilko played with the Leafs), the Toronto audiences would usually go nuts. The image on the right comes from a video of a 1995 concert in Maple Leaf Gardens, apparently shot with a video camera hidden under a coat, and shows the climactic moment when the song was first performed on Bill Barilko’s home ice on the Day for Night tour. Barilko’s banner would be visible to the crowd thanks to the stage lights that illuminated the arena during each chorus. (You can find the video here; the song starts at about 26 minutes. See also Michael Barclay, Ian A.D. Jack, and Jason Schneider’s discussion of this moment in their book Have Not Been the Same: The CanRock Renaissance, 1985–1995, rev. ed. [Toronto: ECW Press, 2011], p. 611.)

The band’s songs are filled with enigmatic historical references to Canada and beyond — though not usually in a straightforward name-checking, historical-plaque kind of way, but more often with a poetic twist that complicates how we relate to the idea of history. For a band that’s so closely associated with Canada as a nation, right down to the Canadian flags in their logo and merchandise, their music is anything but jingoistic nationalism. The complexity and topicality of the Hip’s lyrics have inspired an unusual amount of research and interpretation by fans, best represented by Stephen Dame’s website, A Museum After Dark.

Finally, the Hip should be of particular interest as a case-study in music archiving because their final tour, in 2016, took place after Gord Downie had been diagnosed with brain cancer. Although the band did not frame it as a farewell tour — it was ostensibly the summer tour for their new album, Man Machine Poem — the public knowledge about Gord’s diagnosis meant that fans were certain they were seeing the Hip for the last time on stage. When I was lucky enough to see them play in Toronto on August 12th, it was the most emotional concert I’d ever seen, and likely ever will see. Also one of the best.

From an archival perspective, this was an extraordinary set of circumstances: a fan community already used to documenting the band’s performances with great care was now attending concerts (if they could get tickets) with a sense of history unfolding before their eyes (and cellphone cameras), and with a sense of how unique and special these performances were as live events.

[Image: Springer Market Square, Kingston, 20 August 2016]

Fans gathered to watch The Tragically Hip’s final concert on a video screen in Kingston’s Springer Market Square near the Rogers K-Rock Centre arena, where the concert took place. Image courtesy of Kelly Turner.

The occasion was such that the CBC took the unusual step of broadcasting the band’s final concert, in their hometown of Kingston, Ontario, live and commercial-free. Not only that, but the CBC pre-empted their own primetime Olympics coverage to show the broadcast. The Prime Minister, Justin Trudeau, was present (in jeans and a Hip t-shirt, just like everyone else). Thousands of people travelled to Kingston just to watch the concert outside the arena, and there were viewing parties all over the country in bars, backyards, and public parks. As I mention in the article, an estimated 11.7 million people watched the broadcast, which means that one-third of the population of Canada stopped what it was doing to watch a rock concert together.

I think there’s much here to be learned about communities, documents, performance, the experience of history, and especially the digital archiving that happens outside of traditional cultural heritage institutions. The Archivaria article unpacks these ideas in detail, and I don’t want to repeat too many of the same points here. However, one thing I wasn’t able to do in a traditional article format was integrate audio of the songs in question. I wrote this post and included some links below partly to make up for that.

As I was writing the piece during the spring and summer of 2018, I was re-listening to the Hip’s recordings to let the music and words sink in. Obviously the published studio and live albums were a big part of that, but I also had the good fortune to have access to the Doug McClement fonds at the University of Toronto’s Media Commons Archives. Doug McClement has worked as an audio engineer in Ontario since the late 1970s, and the many archival boxes of DATs, CDs, and other formats which he donated to U of T include numerous soundboard recordings of Hip concerts over the band’s career. (This collection is really amazing and extends well beyond The Hip. I’m planning another blog post about it down the road.)


A peek inside one of many, many boxes of DAT soundboard recordings from the Doug McClement fonds at the University of Toronto’s Media Commons Archives.

For example, I remember when I first heard the soundboard recording of an extended version of “New Orleans Is Sinking,” performed at Fort Henry in Kingston, Ontario, in 1991, which includes an alternate version of Gord’s famous “Killerwhaletank” story. Most fans probably know “Killerwhaletank” through an unofficial recording that still circulates on the web (sourced from a Westwood One radio broadcast of a 1991 show at The Roxy in West Hollywood, later released as a B-side on the “Long Time Running” single, and finally included, along with the rest of the 1991 Roxy show, on the 2021 Road Apples 30th Anniversary Edition box set). But the Kingston version gave me the shivers when I heard a fragment of what would become “Nautical Disaster,” played over a climactic moment in Gord’s story about a hapless aquarium worker caught in a cetacean love triangle. It’s just a brief guitar figure played over two chords (Em–Dmaj), but it helped me understand how the band approached the relationship between performance and composition, and between the stage and the studio.

From that point I started trying to recover the live workshop moments that helped the band craft what would become the song “Nautical Disaster.” Thanks to the McClement fonds and HipBase.com, that involved listening to every surviving live recording of “New Orleans Is Sinking,” in order, during the years leading up to the release of “Nautical Disaster” on Day for Night in 1994. The funny thing about doing archival research on creative works is that sometimes the research process begins to take on the hue of the images and themes from the materials themselves. While attempting to follow “Nautical Disaster” through its composition history – that is, the incomplete history documented by live recordings – I couldn’t shake the uncanny connection between that work and the song itself: a song about an individual’s memories, told through images from a dream about the sinking of the Bismarck, with its haunting depiction of survivors’ guilt imagined as the sound of fingernails beneath the hull of a lifeboat. It’s a song about the seeming arbitrariness of survival and loss. Archivists and archival researchers have to reckon with the same thing, and fans of the band were reckoning with that arbitrariness, too, as they said goodbye to Gord on the final tour.

Thankfully, a great many of the band’s unpublished performance recordings have survived, and the McClement fonds includes many (like the 1991 Kingston soundboard) that have not circulated. The Media Commons Archives cannot publish these recordings because they don’t hold the rights, and they’re careful to ensure that unpublished recordings are not leaked; such recordings can only be heard on-site at Media Commons in the Robarts Library complex. The trust relationship between donors, rights-holders, and media archives is essential, because without it we might not have access to these materials at all. (The McClement fonds is currently accessible to researchers on-site; see this page.) While I can’t share access to any of those soundboard recordings — I don’t even have copies myself — there are audience-recorded versions of some of the same performances on HipBase.com, made possible by the band’s longstanding policy of permitting taping for non-commercial use. (For example, see their statement on audience recording for the 2016 tour, which was still online at the time of posting.)

In the article, I also mention how fans initially circulated copies of the CBC broadcast online. Now that it’s been officially released, I recommend supporting the artists by purchasing it in its published form: The Tragically Hip: A National Celebration (Universal Music Canada, 2017). Just to be clear, bootlegging in the form of illicitly recording artists’ work and selling it for profit is piracy, and I don’t support it. It’s also important to understand that the online communities I focus on in the article are amateur tape-traders, not pirates. They curate and share audience recordings, which in the Hip’s case are made for non-commercial use with the band’s consent. Other kinds of online communities actually undercut the pirates by sharing bootlegs publicly and for free.

Among these communities there’s an ethos of supporting the artists by purchasing their official releases (and concert tickets, t-shirts, etc.). Serious bootleg-collecting blogs even take down individual tracks if the artists subsequently release them on official live records. I have no interest in validating music piracy, but it would be worthwhile for recording artists and music archivists alike to understand how some valuable archival work is being carried out by pro-am collectors, many of whom are preserving music history that might otherwise be lost. I wrote the article partly to pay tribute to the work they do.

If you read the article, I suggest the following playlist to go along with it. This includes the album versions of some of the songs I mention (such as “Fifty Mission Cap”), but also links to some of the unpublished recordings which are the focus of the article. The list below more or less follows the order in which the songs are discussed:

  1. “Gift Shop,” from Trouble at the Henhouse (MCA Records, 1996)
  2. The best-known example of “New Orleans Is Sinking” being used as a workshop song is one where Gord and the band share some work in progress through an extended jam. It became known as the “Killerwhaletank” story (one word). It was recorded at The Roxy in West Hollywood in 1991, broadcast on the Westwood One radio network, released only as a B-side to the single “Long Time Running,” and circulated on a bootleg known as Live at the Roxy (with some variants). An official, published copy is hard to find (I tried), but there are numerous versions from the bootleg on YouTube. In 2021, the entire 1991 Roxy show was released as part of the Road Apples 30th Anniversary Edition box set (though the band chose not to include the Roxy show in the version of the box set for streaming services).
  3. A second, lesser-known but (imho) better version of the “Killerwhaletank” story can be found in an extended performance of “New Orleans Is Sinking” from a concert in Kingston, Ontario, on 29 August 1991. There’s a soundboard version in the McClement fonds, but an audience recording of the same performance can be found on HipBase.com’s live archive (see the section labelled “Master-MP3”; here’s a direct link). You can hear the chord changes that would become part of “Nautical Disaster” starting at about 5:20, just as Gord describes getting in the water with the whales. [Update: as of November 2021, HipBase.com’s live recording archives are available only for a limited time as torrents. If the collection is reposted elsewhere, I’ll update these links.]
  4. The first full performance of “Nautical Disaster” as a complete song (that I’m aware of) took place at the Kumbaya Festival at Ontario Place in Toronto on 5 September 1993. Gord introduces “Nautical Disaster” by name as a new song, though the band still performed it as an interpolation in the middle of their workshop song, “New Orleans Is Sinking.” An audio version sourced from a TV broadcast can be found on HipBase.com (in the section labelled “Master-MP3”; direct link here), and the video itself can be found on YouTube (the song starts at about 16:30; also check out “At the Hundredth Meridian” for lyrics that would later show up in “Scared.”)
  5. “Nautical Disaster” in its final version on the Day for Night album (MCA Records, 1994).
  6. “Fifty Mission Cap,” from Fully Completely (MCA Records, 1992).
  7. “Wheat Kings,” from Fully Completely (MCA Records, 1992).
  8. “Montréal,” a haunting song about the 1989 mass shooting at the city’s École Polytechnique, was recorded during the sessions for 1991’s Road Apples and occasionally performed but never officially released. The lyrics are posted on the band’s website in their “Unreleased Songs” section. There are numerous copies of the studio version to be found on YouTube. As I mention in the article, the most significant performance of the song was its last: in the city of Montréal on 7 December 2000, captured on a fan-shot video posted to YouTube (the song begins at about 1:49:40; again, thanks to Tony Rampling for posting). [Update: “Montréal” is included on the 2021 EP Saskadelphia, which makes available previously unreleased songs from the 1990 recording sessions for Road Apples. In a significant departure from the rest of the EP, the band chose to include the 2000 live version described here and in my article, not the 1990 studio version.]
  9. Another mid-song story from the same 1991 Roxy show that brought us “Killerwhaletank”: this time, the darkly comic “Double Suicide” love story, told over an extended jam in the middle of the song “Highway Girl.” As with “Killerwhaletank,” linked above, there are numerous versions from the Live at the Roxy bootleg on YouTube. The only official (and hard-to-find) release of this live version was as a B-side to the single “Twist My Arm” from Road Apples. In 2021, the entire 1991 Roxy show was released as part of the Road Apples 30th Anniversary Edition box set.
  10. “Looking for a Place to Happen,” from Fully Completely (MCA Records, 1992).
  11. As I mention in the article, on the final 2016 tour Gord didn’t say a whole lot to audiences from the stage. But at the Toronto concert on August 12, he happened to make some poignant comments about memory and recording during the ending of the final song of the night, “Ahead By a Century.” You can find an audience recording of that show on HipBase.com (in the section labelled “Master-FLAC”; direct link here). [Update: as of November 2021, HipBase.com’s live music archive is no longer available except as a time-limited torrent. An audience recording of the show is still available for download here: https://www.guitars101.com/threads/the-tragically-hip-2016-08-12-toronto-on-aud-flac.276489/.] I was lucky enough to be at that show, but as I mention at the end of the article, I wouldn’t have known what Gord said at the end if someone hadn’t recorded and posted the show. (Thanks, whoever you are.) You can read more in the published article, and I’ll leave it to the recording to speak for itself.

If you read the article, I hope you enjoy it. (Sorry it’s so long.) If you’re new to The Tragically Hip and other Canadian bands like them, I hope you enjoy the music and support the artists however you can. And if you’re a longtime Hip fan or tape-trader, I hope you’ll check out the work of archival researchers in open-access journals like Archivaria (currently everything up to 2015 is available), and see some common purpose in making the materials of popular cultural heritage accessible for the future. There are a lot of music fans out there exploring archives of all kinds, and there’s more work to do.

Five Ways to Improve the Conversation About Digital Scholarly Editing

Welcome to the blog for my new research project! Except that it’s not really new: like many research projects, this one pulls in some threads I’ve been working on for some time, often at the edges of my other projects. I don’t plan to recycle content here, but for my inaugural post it seems appropriate to return to something I posted back in 2016 as an invited response to the recent white paper by the Modern Language Association’s Committee on Scholarly Editions, “Considering the Scholarly Edition in the Digital Age.” The article-length post below originated as remarks for a roundtable at the 2016 Society for Textual Scholarship conference in Ottawa, and was published in expanded form on the CSE’s blog. (My thanks to the conference organizers and the CSE for the invitation to contribute.) Readers looking for a short version are invited to skip down to the numbered items in bold—in fact, point number 5 happens to be the premise of my Veil of Code project, about which I’ll be posting semi-regularly (and more concisely) here. You could call this a road map of how I travelled from my earlier work on Shakespeare and digital editing to this new project.

*

When I first read the CSE’s white paper “Considering the Scholarly Edition in the Digital Age” I couldn’t help but think back to David Greetham’s pointed review of an earlier statement on digital scholarly editing sponsored by an earlier incarnation of the CSE. The collection Electronic Textual Editing was published by the MLA in 2006, with open-access preview versions of its contents available thanks to the Text Encoding Initiative. Prefacing its commissioned original articles are two largely uncontextualized statements issued by the CSE in the past: “Guidelines for Editors of Scholarly Editions” and “Guiding Questions for Vettors of Print and Electronic Editions”. Greetham’s review praises the strengths of individual articles in this collection — I continue to reference and assign some of them myself — but also skewers the confused logic and tone of the collection as a whole, calling it an “ex cathedra statement from the policing organization of our discipline(s)” (133). Greetham’s vigilance against the subtle creep of institutionalized orthodoxy was not out of place, then or now. Yet the CSE’s most recent white paper is refreshingly free of any doctrinal tone. Instead, as our co-panelist John Bryant says in his post, it’s “a delightfully diplomatic document.” Its framing as a white paper, not guidelines or best practices, marks a clear invitation to discussion and debate, with all topics on the table.

Discussion and debate — the informed and intelligent kind — is exactly what we need as digital humanities increasingly becomes the context in which textual work like scholarly editing takes place. My aim in this post is to consider ways to raise the level of that debate. Before unpacking five specific ways we can improve the conversation, I’d like to compare and contrast how that conversation has unfolded in the CSE white paper and in another, far more contentious document.

In the weeks since our roundtable took place at the Society for Textual Scholarship (STS) conference in Ottawa, I’ve been thinking about the CSE white paper in context with another recent public invitation to debate: a controversial Los Angeles Review of Books piece called “Neoliberal Tools (and Archives): a Political History of Digital Humanities,” by Daniel Allington, Sarah Brouillette, and David Golumbia. No doubt the LARB article would have animated our discussions in Ottawa had it appeared slightly earlier — though whether it would have raised the level of discussion is another question, to which I’ll return in a moment. Notably, both documents converge quickly on the same specific point. The LARB authors express it in their opening, when they associate digital humanities with a “discourse [that] sees technological innovation as an end in itself.” The CSE white paper, published just a few months earlier, points to a similar concern when it cautions that “the digital is neither inherently a site of innovation nor a necessarily useful innovation in itself.” Good advice, especially when it comes from textual scholars who study the longue durée of textual transmission and its technologies.

One of the crucial differences between the two pieces is that the LARB article takes “Digital Humanities,” as such (with capital letters), as its object of analysis. That, I believe, is a mistake for anyone wishing to raise the level of discussion. After more than 15 years of doing research that could be categorized as digital humanities (and using the label sometimes myself as a tactical term), it’s not the term or the work that I find increasingly unsatisfying; it’s the baggy logic of the category itself. To be clear, I could never imagine walking away from many kinds of work that people call digital humanities, as the LARB authors seem to advise, and as Alan Liu promptly critiqued them for doing. Rather, it’s that I’m more interested in exploring how my own digital scholarship refracts into more specific topics, like born-digital bibliography and humanities-oriented visualization and the pedagogy of text encoding and the labour of digital scholarly editing and reading media history through Shakespeare. Scholarly editing, the focus of the CSE white paper, is likewise specific enough to support an intelligent conversation.

So while I’m sympathetic to what Allington, Brouillette, and Golumbia were trying to accomplish in their critique of neoliberal labour practices and the digital humanities — in which scholarly editing is most certainly implicated — I’m also disappointed that the article that resulted fails to raise the level of debate about textual labour in the academy. As other commentators have pointed out, the LARB piece suffers from several structural weaknesses: its history is patchy and selective, allowing parts of the digital humanities to stand in for some imagined whole; it underplays the politically and socially progressive work done by many archivists and scholarly editors; it’s awkwardly fixated on what seem (to me, at this distance) to be local and personal disputes originating at the University of Virginia; and — this is where it really lost me — it lumps one of the best textual scholars and critics of digital instrumentalism and neo-positivism, Johanna Drucker, in with the very things she critiques so well (here, for example, and especially in her first two chapters here). (See also Juliana Spahr, Richard So, and Andrew Piper’s insightful and measured response in the LARB.)

One lesson we might take from this comparison of the CSE and LARB documents, then, is that textual scholarship, as a field which studies the intersections of labour, culture, and technology, would be a good starting place to mount an intelligent critique of… well, maybe not “Digital Humanities” but a more constructive and specific framing of the structural problems that the LARB authors have in their sights. For example, Amy Earhart and our co-panelist Peter Robinson have recently critiqued certain misconceptions about textual scholarship that are prevalent among many digital humanists. The political implications of their critiques, which are solidly grounded in the history of scholarly editing, should be of interest to those who looked for a more substantial analysis in the LARB piece.

Indeed, the discussion to which the CSE white paper is contributing would be improved by a frank and evidence-based critique of specific, identifiable tendencies in the digital humanities, just as the STS community questioned the orthodoxies of the New Bibliography many years ago. (I’d like to think there’s some shared intellectual DNA in the current #transformDH community, which the LARB article puzzlingly neglects to mention.) But we need to do the homework for that critique to be substantive and translatable into action. That homework includes looking beyond the United States to understand what’s being called digital humanities, looking beyond the post-2009 blogosphere/twitterverse for the writing that matters, and looking beyond English departments. Scholarly editing, after all, is not entirely about literature, just as textual scholarship is not entirely about scholarly editing.

On that note, the LARB piece does usefully raise the question of the “us” who’s having the conversation about textual scholarship and labour in the twenty-first century. Like Johanna Drucker, I’m a humanities scholar based in an Information school. Does that mean I have an “idiosyncratic relationship to the humanities,” as the LARB authors say Drucker does? (I certainly hope so: all my favorite colleagues are idiosyncratic…) And like Matthew Kirschenbaum in his response to the LARB article, I find myself less concerned with whether I’m a digital humanist by someone else’s definition, and more inclined to reflect upon the not-entirely-categorizable experience of doing digital textual work — the kind that the CSE white paper is concerned with. For better or worse, my own experience means that I don’t view scholarly editing, bibliography, book history, or any of the CSE white paper’s topics with a literature department as my primary frame of reference. Yet one might argue that textual scholarship itself has an idiosyncratic relationship to the humanities. It can certainly be a welcoming field for those of us whose relationships to the humanities are idiosyncratic, too.

So, based on my own idiosyncratic experiences of doing digital textual scholarship in and out of English departments over the past 15 years or so, here are five suggestions for ways to raise the level of discussion, or at least take it somewhere new:

 

1. Don’t get hung up on the question “what is a (digital) scholarly edition?”

The CSE white paper spends a lot of time on this, motivated by a well-intentioned desire to open rather than settle the question. But as my book history students once told me in one of my first classes at the University of Toronto — after I’d glibly thrown “what is a text?” down on the seminar table a few times too many — we can play the definitional game all day and get nowhere. It’s more useful to shift the question into the gerund: not “what is a scholarly edition?” but “what is scholarly editing?”; and by extension, not “what is an archive?” but “what is archiving?” The gerund makes it an activity, which humanizes the question by making it about agents.

This point is motivated by my own experience in seeing the history of technology differently when I stopped focusing so much on computers, and more on computing. The history of textual technology becomes a lot richer, more plural, more critically provocative, and simply more interesting when computing as a range of socialized practices isn’t fettered to a single essentialized device. (Hence another of my complaints about most definitions of digital humanities: they construe the meaning of digital far too narrowly, even as they attempt to broaden the humanities.)

What, then, can we learn by focusing less on editions and more on agents who carry out the work of editing and archiving? Who are the overlooked editorial and archival agents of the digital age? Is it time to question the perceived status of the scholarly edition as the telos of textual scholarship generally? What do we gain and lose by conceiving of textual scholarship in the form of large digital editing projects, whose shape and aims are so often determined by the needs of funding, infrastructure, and project management? If one is a textual scholar who wants to do new work in digital form, is a large digital scholarly editing project going to be the only game in town?

For me, the answer to this final question is not necessarily, but my work is increasingly focused on all of these questions — and without the distraction of worrying about what a digital scholarly edition is or isn’t.

 

2. Use the word archive carefully, and acknowledge the scholarship of archivists.

The CSE white paper bases much of its argument on what it identifies as the trend “towards the creation of an edition as a single perspective on a much larger-scale text archive.” That distinction needs more thought. Archives are fascinating places, which I’ve learned from researching in them and helping to train future archivists at the University of Toronto’s iSchool. But those who do archival work know that we both gain and lose by embracing the word archive too readily and applying it too broadly. To take an analogy from another field, imagine what any museum curator or museum studies scholar must think whenever they hear a lunch buffet described as “well-curated.” The object of my criticism here isn’t theorists like Derrida, Foucault, or Agamben, who have helped to introduce the archive as a multivalent term in poststructuralism, nor is it the building of the resources that John Bryant calls “digital critical archives” in his post on the CSE blog, which are developing into valuable new forms of textual scholarship. Indeed, the meaning of the word archive has long been a moving target. For what it’s worth, I wrote the first two chapters of my book The Shakespearean Archive as an attempt to bridge the archival, editorial, and poststructuralist understandings of the term, and with no desire to police its meaning.

Rather, the problem is the widespread tendency to invoke the term archive indiscriminately, sometimes without much acknowledgement of its history or of the particulars of archiving and archival research. Whenever we yoke the terms edition and archive together, whether in opposition or on a continuum, we need to remember the conversation between disciplines that should be happening as well. But with the increased currency of the term digital archive in the digital humanities, it’s become all too easy to neglect the fact that archivists, like editors, have a rich and active scholarly literature of their own in journals like Archival Science, Archivaria, and American Archivist, where they sometimes even engage topics in our field like scholarly editing. (See Paul Eggert’s and Kate Theimer’s critiques of similar tendencies from the editorial and archival sides, respectively.)

Failure to engage the scholarship of actual archivists, whether among digital archive-builders or their critics (like the LARB authors), can lead both groups to underestimate the political stakes of the work that archives, and archivists, perform. Kirschenbaum makes a similar point in his response, and the archivists that I work with don’t need to be told that archives are a critical construct in which power is exercised, where ideology quietly shapes the work of memory. Some in their field have already written the book on the subject. The CSE white paper at least avoids active misrepresentation of what archivists do, but the white paper and LARB article alike perpetuate a long-running failure to engage archiving — and archivists — beyond metaphor.

The loss is ours. If more literary scholars, editors, historians, and digital humanists actually read and referenced the scholarly literature of archivists, we’d find a wealth of thinking on concepts like provenance, original order, respect des fonds, the nature of records, authenticity, and the materiality of cultural production. (Here’s a good place to start.) What we would learn from these archival concepts, and from the intellectual traditions behind them, is that archives are edited. That’s a crucial point of context for reading the CSE white paper, which presents an edition as something decanted from an archive, and implies that the completeness of the archive is what allows editions to be selective, critically partial, and intellectually risk-taking. Yet actual archives are equally defined by their incompleteness: missing records, embargoed correspondence, conjectural orderings and groupings of records (which may or may not reflect the actual practice of those who produced the records), and the limitations and strengths of their finding aids (which are created by archivists who are just as human as scholarly editors).

In short, if we’re thinking of an archive, digital or otherwise, as a structure that exists prior to editing and other critical interventions into the documentary record, and which represents completeness rather than selectivity, and which doesn’t have interpretation built into its very bones, then we’re failing to learn something fundamental from archivists themselves. We might still refer to our digital projects as archives — no field or profession owns the term — but it would be a measure of progress if we treated the term as one that our digital projects have to earn.

 

3. Don’t let data become the default term for all digital materials.

As with the indiscriminate use of archive, I find the same tendency with the word data especially counterproductive; indeed, I wrote the final chapter of The Shakespearean Archive (“Data and the Ghosts of Materiality”) largely as an argument against the fetishization of the word by some (but not all) digital humanists. The problem is simply this: when the CSE white paper says that “a critical edition … draws its data from [an] archive,” or when, say, a text of Hamlet is referred to as data to be mined, the word data becomes an ontological steamroller that flattens out our discourse. Textual scholars have spent decades, even centuries, developing nuanced language for the materials that we study. Just think of the different resonances of the word text, let alone edition, book, version, witness, record, variant, fragment, imprint, issue, paratext, inscription, or incunable, to name a few. There’s a wealth of nouns, plural and singular, that textual scholars use to name the world of our materials, and to gain some semantic purchase on that world.

And yes, data is another of our plural nouns: bibliographers and book historians were doing quantitative humanistic scholarship decades before it was fashionable in the post-2009 digital humanities, and today one can’t pursue certain questions in these fields without measuring the widths between chain lines, or determining the price of paper, or calculating the financial risks of publishing playbooks, or compiling databases of sales figures, book prices, and reprinting rates. These are the forms of data that my field has been using for decades, along with other kinds of evidence, including texts — all of which require interpretation, and never speak for themselves. Data as a word is hardly something new in my own parts of humanities, contrary to claims made by some DH proponents and critics alike that the concept of data is inherently strange and threatening to humanists.

When we use the word indiscriminately, when any digitally transacted material becomes data in our language and thinking, we face three problems: 1) it becomes easy to forget what any social scientist worth her salt would tell us, which is that data are generated, and they can be generated well or poorly, but they are never just neutral stuff waiting in the ground to be mined; 2) it becomes too easy to avoid the more technically precise terms I’ve mentioned that keep us honest and reflect the ontological diversity of our materials (see above); and 3) we replace those granular terms with a word that, after the mid-twentieth century, cannot be detached from its connotations of positivist and instrumentalist ways of organizing knowledge and labour. On this last point, one imagines the LARB authors would rightly point to data’s use — or, more accurately, co-option — in neoliberal management philosophy.

The history of this problem is too big to elaborate here, but in short, the reasons for data‘s unhelpful ubiquity are rooted partly in the history of programming (i.e. the innocently named “data files” that were too big to hold in active RAM), and partly in (yes) the early 21st-century neoliberal university and the language of managerialism. See the articles in Lisa Gitelman’s collection Raw Data Is an Oxymoron for more historical context and substantial questioning of the term, and Christine L. Borgman’s excellent Big Data, Little Data, No Data for a detailed survey of the term’s deeply contextual meanings in various disciplines today.

It’s bad enough when university administrators and academic policy-makers latch onto a term like data and let it do their thinking for them (remember excellence?), but textual scholars — having debated years ago whether bibliography is a science — should understand better than most humanists what’s at stake in letting other fields like computer science define our ontologies for us, even quietly through the casual borrowing of language. As with the word archive, when we say data we should mean it.

 

4. Don’t be too quick to generalize about “the digital age.”

Like print culture and, for that matter, digital humanities, the “digital age” of the white paper’s title has become too easy an object for generalization. Though I use the term myself, I try to avoid it for the same reason that I avoid using “the digital” as a strategically vague adjectival noun (as in “the humanities need to embrace the digital”; digital what, exactly?). One problem with the premise of the CSE white paper is that it begs the question; that is, from its title on down it takes digital technology to be the dominant characterizing feature of scholarly editing in the present. What if it had instead begun with the more open-ended topic of “the scholarly edition in the present”? We would certainly need to discuss digital technology as an agent of change, but it’s not the only one. In 2016, editing and archiving alike are also happening in an age of truth and reconciliation commissions and new understanding of aboriginal and minority literatures, of editing as a cultural practice, of emerging forms of academic and textual labour, of changing models for review and publication, and of new ways of thinking about canons, canonicity, and authorship, to name a few other currents that shape our field in the present. I can understand the need to limit the focus to do justice to a given topic, but when accounting for the changing contexts of scholarly editing in the present, we need to cast the net wider than the CSE white paper does (keeping in mind that it’s just a white paper).

And lest any reader grumpily demand at this point “how does any of this help me identify press variants? just give me my collation software! and stay off my lawn!!” it’s worth pausing to reflect on what editing, bibliography, and other kinds of textual scholarship are for; whose texts we edit, and why (or why not); and what relationship between past and future, mediated by the transmission of texts, we are helping to shape in our research and teaching. These questions may reasonably lie beyond the scope of the CSE white paper, but neither should we let “the digital age” limit our focus to digital tools when we think about editing in the twenty-first century.

And, finally, on that note…

 

5. Scholarly editing in the present doesn’t just mean using digital tools and publishing via digital networks; it also means editing born-digital texts.

This seems to be a genuine blind spot in the CSE white paper. It makes no direct mention of how editors should face the textual condition (or .txtual condition, as Kirschenbaum calls it) of the late-20th and 21st centuries, which will be overwhelmingly digital in terms of how authors and other agents involved in textual production make the texts we edit and study. Any scholarly editor working on an author from the 1980s to the present may well be dealing with word processor files (and disc images, and actual discs), and anyone interested in the scholarly editing of electronic literature or video games or e-books or digital records of musical theatre or digital (re)publication of public-domain print texts will obviously be working with very new kinds of materials for textual scholarship. The closest we get to this question in the CSE white paper is a passing reference to N. Katherine Hayles and Jessica Pressman’s useful concept of comparative textual media. But overall the white paper — which, remember, is titled “Considering the Scholarly Edition in the Digital Age” — contains barely a hint that we might need to edit materials from the digital age.

I don’t attribute the white paper’s silence on this topic to conservatism. That would be out of step with the CSE’s position overall, and John Young clearly flags the importance of born-digital artifacts in his earlier CSE blog post. But the silence in the white paper is puzzling. Perhaps it says something about the dominant trends in scholarly editing and digital humanities alike. I can’t help but think again of Greetham’s critique of the Electronic Textual Editing collection, especially his point that “the absence of painting, dance, film, television, video games, music (about which there has been some very challenging critical discussion of late [i.e. 2007]) makes this collection almost relentlessly text- (or linguistics-) based” (135). I’m also reminded of Andrew Prescott’s argument that digital humanities needs to concern itself less with tool-building and tool-use, and more with the study of born-digital artifacts themselves, along the lines D.F. McKenzie advocated. Other fields and organizations such as STS and SHARP (the Society for the History of Authorship, Reading, and Publishing) have been genuinely welcoming to digital textual scholarship. Likewise, David Greetham’s indispensable textbook, Textual Scholarship: an Introduction (which was cited in the white paper), and the recent Cambridge Companion to Textual Scholarship (which wasn’t) both deal head-on with the challenge of the born-digital — the latter in a chapter by Kirschenbaum and Doug Reside with that very phrase as its subtitle. It’s also a promising sign that Kirschenbaum’s new book, Track Changes: a Literary History of Word Processing, will no doubt be at the top of many textual scholars’ reading lists this summer [i.e. 2016].

A study like Kirschenbaum’s could not have been written without digital tools, but perhaps it’s time to question the primacy of the digital tool in DH curricula, funding models, and in our thinking generally. A deliberate focus on digital materials, not just digital tools and editions, would be consistent with the CSE’s evident desire to broaden the way we think about scholarly editing. It would also be consistent with the MLA’s history of championing a broad-minded conception of the materials that presently need to be understood and preserved for future scholarship. (See the 1995 report on the “Significance of Primary Records,” written by an MLA committee chaired by G. Thomas Tanselle.) What we now call digital curation and preservation were only beginning to gain steam in the mainstream scholarly imagination when that report was written (see Jeff Rothenberg’s “Ensuring the Longevity of Digital Information,” first published in Scientific American; note the Shakespeare example!). Today digital curation offers a new way for textual scholars to serve the public good by helping to preserve, understand, and critically interrogate the materials that comprise our digital cultural heritage. In doing so, I suspect textual scholars may find their own answer to Alan Liu’s pointed query, “Where Is the Cultural Criticism in Digital Humanities?”. For me, this mission, far more than passing enthusiasm to hook scholarly editing into big data or social media, is where I find reason to feel excited, motivated, and not a little humbled and challenged by what lies ahead for textual studies.

It’s good motivation to stop writing excessively long blog posts and get back to writing my own next book, tentatively titled The Veil of Code: Studies in Born-Digital Bibliography, and to look forward to teaching courses this coming year that deal with the future of the book and born-digital applications of analytical and historical bibliography. At some point in those classes, we inevitably re-enact one of textual scholarship’s primal scenes: a group of students in a rare book library, gathered around a book or other textual artifact on a table, together using the evidence of our senses to examine the paper stock, readers’ marks, scribal construction of mise-en-page, dog-eared leaves, the bite of letterpress into paper (sometimes vellum), or changing representations of authorship in successive editions. We imagine what questions we might ask of the object, and what questions it asks of us. We consider the relationship between evidence and interpretation. We train our eyes to see how form effects meaning. If we can’t read the text, we slow down and puzzle it out as best we can. Some students take pictures on their smartphones, others simply reach out to touch a parchment leaf that still transmits information centuries after a scribe prepared it to receive writing (which is why we only touch the uninked parts of the leaves). And, throughout, there is always a dialectic between the textual artifact on the table and the bustling twenty-first century city street visible just outside the window.

With that imagined scene as context, I’ll end with this small thought experiment: consider what difference it makes to this scene if digital technologies are admitted into it exclusively, or even primarily, as tools, or as data sets, and never as artifacts on the table themselves. What difference does it make for those students, and for the social world outside the library that they rejoin when the class ends? If we are seriously to consider the scholarly edition in the digital age, or the politics of digital textual labour in the neoliberal academy — or even both topics together — then we must consider these questions, too.

Alan Galey
Faculty of Information
University of Toronto