Writing//Posthuman: The Literary Text as Cognitive Assemblage
N. Katherine Hayles
Duke University
Friedrich Kittler has famously argued that the shift from handwriting to
typewriter inscription correlated with a massive shift in how writing, and the voice
it seemed to embody, functioned within the discourse networks of 1800 and 1900,
respectively (Kittler, 1992). We are in the midst of another shift, even more
1900/2000, it would argue that the issue is now not merely the role of the voice but
and China are undergoing complex social, cultural, economic and biological
But I am getting ahead of my story, so let us return to the essay’s title for
further explication. The double slash refers in part to the punctuation mark found in
all URLs, read by computers as indicating that a locator command follows, thus
performing the critical move that made the web possible by standardizing
formatting codes rather than devices. For an analogy to highlight how important
this innovation was, we might think of an imaginary society in which every make of
car requires a different fuel. The usefulness of cars is limited by the availability of
stations selling the “right” fuel, a logistical nightmare for drivers, car manufacturers,
and fuel producers. Then someone (historically, Tim Berners-Lee) has a terrific
idea: rather than focus on the cars and allow each manufacturer to determine its
fuel requirements, why not standardize the fuel and then require each manufacturer
to make the vehicle’s equipment compatible with this fuel? Suddenly cars have
increased mobility, able to travel freely without needing to know whether the right
stations will be on their route; cars proliferate because they are now much more
useful; and traffic explodes exponentially. The double slash references this
standardization and the transformation it has wrought.
What about the two terms the double slash connects and divides, “writing”
and “posthuman”? Since I helped to initiate posthuman studies nearly twenty years
ago by publishing the first scholarly monograph on the topic (Hayles 1999),
assumptions about the liberal humanist subject, breaking open the constellation of
urged resistance to those that supposed the biological body could or should be
into a computer. The theoretical underpinning for this power fantasy was (and is) a
Two decades later, cultural and theoretical trends have led posthuman
that see positive roles for the posthuman as antidotes to the imperialistic,
Enlightenment. Although a full analysis of these trends is beyond this essay’s scope,
contemporary versions of the posthuman can be roughly divided between those that
see technological innovations as central, and others that focus on different kinds of
forces, structures, and environments to posit the posthuman subject. The catalysts
for these versions are diverse, ranging from the huge influence of Gilles Deleuze’s
over nonhumans and see the posthuman as a way to achieve this, to a growing sense
of crisis over the human impact on earth systems and a corresponding desire to re-
envision the human in posthuman terms. Of course, these categories are far from mutually exclusive.
Nevertheless, the centrality of technology, or the lack of it, provides a useful way to
group these different versions and locate my own position in relation to them.
At one end of the spectrum is the work of Rosi Braidotti (Braidotti 2006; 2013). Influenced by Deleuze, Braidotti departs from his
an entity open to desires, intensities and lines of flight that Deleuze evokes as he and
his co-author Guattari write against the subject, the sign, and the organism.
Braidotti urges her readers to become subjects open to flows so intense that they
are almost—but not quite—taken to the breaking point, “cracking, but holding it,
still” (Braidotti, 2006, p. 139) even as the currents rip away most conventional
identifies with his original synthesis between Luhmannian systems theory and
recursive doubling back of a system on itself, in response to and defense against the
posthumanisms, a breathtakingly arrogant move that does not, of course, keep those
who want to include technology from also claiming title to the posthuman.
At the other end of the spectrum are visions of the posthuman that depend
on how humans may transform themselves, for example through whole brain
multiple embryos and choosing the one with the most desirable genetic
embryonic stem cells, and then that generation is used to develop more gametes,
each time selecting the most genetically desirable embryos, thus compressing
like Braidotti and Wolfe are primarily concerned with ethical and cultural issues, for
Also on this end of the spectrum is philosopher David Roden (2015), who
might be developed, and what the relation of such life would be to humans. He
humans as we know them, having emerged as a result of what science fiction writer
Vernor Vinge calls the “singularity” (Vinge 1993), the historical moment when
humans invent a life form vastly superior in intelligence and capabilities to their
own powers. He concludes, unsurprisingly given his premises, that humans will be
unable to control the direction or activities of posthuman life and that it is entirely
possible such emergence will mark the end of the human species. While Roden is
perspective and judges posthuman life on its own merits rather than from a human
viewpoint (although one may question how this is possible). His version of the
posthuman finds many reflections in contemporary novels and films, with the
difference that he refuses to speculate on what form posthuman life might take. He
argues that a rigorous philosophical evaluation requires that the future be left
entirely open-ended, since we cannot know what will arise and any prediction we
might make could be wildly off the mark. His stance recalls Yogi Berra’s quip, “It’s tough to make predictions, especially about the future.”
The problem with all these versions of the posthuman, as I see it, is that they
technologies that do not yet exist. While speculation about an unknown future may
Roden’s careful avoidance of predictions about the forms posthuman life might take
approach is to find a middle way that acknowledges the seminal role computational
media play in transforming how we think of human beings, and yet remains
possible to suppose that notions of posthumanism may have arisen earlier, say
before 1950 when computation really took off, I think such ideas would have
everyday experiences made the idea of a historical rupture not only convincing but
obvious, providing the impetus to think of the “human” as something that had
become “post.” Computational media have now permeated virtually every aspect of
My own version has been called “critical posthumanism” (Roden 2014, pp.
44-45) and “technological posthumanism” (Cecchetto 2013, pp. 63-92). I note that
both of these categories tend to elide the differences between my views and those
because transhumanism usually implies that humans should seek to transform their
that already exist and the ways in which they are transforming how “the human” is
performed and conceptualized. My writing is directed not to the future or the past
but to the complexities of the hybrid human-technical systems that constitute the
With its sense of marking some kind of rupture in the human, the posthuman
offers opportunities to re-think humans in relation to their built world and to the
planetary ecologies in which all worlds, human and nonhuman, are immersed.
science and other fields, I develop the idea of the cognitive nonconscious, a mode of
reassessment of the role of consciousness in human life in ways that open out onto a
much broader sense of cognition that extends beyond the brain into the body and
and organisms that do not possess consciousness or even central nervous systems,
in all biological lifeforms and technical cognitive media such as computers. The
consciousness is decentered and located within a much deeper, richer, and broader
cognitive ecology encompassing the planet. This is the context in which we should
that of Deleuze and Guattari (1987; see also DeLanda, 2016), although it has some
differences as well. Since Deleuze and Guattari want to deconstruct the idea of pre-
but whereas Latour places humans and material forces on the same plane, my
framework distinguishes between cognizers on the one hand and material forces such as
tornadoes, hurricanes and tsunamis on the
other. The crucial distinguishing features, I argue, are choice and interpretation,
activities that all biological lifeforms perform but that material forces do not. Choice
and interpretation are also intrinsic to the design of computational media, where
information in contexts that connect it with meaning.” Although space will not allow
me to fully parse this definition here, suffice it to say that I see cognition as a
process, not an entity, and moreover as taking place in specific contexts, which
instantiations for computational media. This definition sets a low bar for something
to count as cognitive; in Unthought, I argue that all lifeforms have some cognitive
capabilities, even plants. Choice (or selection) enters in because it is necessary for
interpretation to operate; if there is only one possible selection, any opening for
cognitive technologies as well as lifeforms and leads to the claim that computers can
Let us turn now to the other term in my title, “writing,” and consider the complex
including how much key pressure is necessary to make the ink impression on paper
dark enough but not so sharp as to tear the paper, and how fast the keys can be hit
without having them jam together. There was no mystery about this process;
everything was plain and open to view, from the mechanical key linkages to the
inked ribbon snaking through the little prongs that held it into place above the
paper. Nowadays, other than scribbled grocery lists and greeting card notes, I write exclusively at a computer.
Here, by contrast, mystery abounds. The surface inscription I see on the screen
most basic level with circuits and logic gates, which in turn interact with microcode
(hardwired or floating), up to machine code and the instruction set architecture for
the machine, which interacts with the system software or operating system, which
supports higher-level languages such as C++, Java and Python, and finally to executable programs such as Microsoft Word. As I
write, multiple operations within these levels of code are running to check my
spelling, grammar, and sentence structures. In a very real sense, even when I
computers that generate the code necessary for my queries to travel through
cyberspace, and network infrastructures that carry the packets along internet
routes.
connoting by its repetitive mark the folding back of writing about posthumanism
cognitive assemblages.
he writes, “evolved under conditions tied to static print media. By contrast, digital
texts change dynamically to suit their readers, political contexts, and geographies . . .
capable of reaching past surface content to reveal platforms and infrastructures that
stage the construction of meaning” (p. 6). He contrasts pen and paper (and I might
“the bridge between keyboard and screen passes through multiple mediating filters.
plywood) the multiple layers of code that comprise it. When printed out, the textual
Motivating Tenen’s concern with digital writing are the “new forms of
copyright restrictions that prohibit the user from accessing the deeper layers of
proprietary code, much less intervening in them. Therefore he argues that to “speak
truth to power—to retain a civic potential for critique—we must therefore perceive
the mechanisms of its codification. Critical theory cannot otherwise endure apart
from material contexts of textual production, which today emanate from the fields of
computer science and software engineering” (p. 3). Moreover, in cases where using
favor of “plain text,” text written with a simple editing program such as TextEdit that uses
minimal coding instructions. By contrast, with pdfs “the simple act of taking notes
becomes a paid feature of the Adobe Acrobat software. What was gained in a
abjure it as much as possible. To his credit, Tenen recognizes that such a stance
risks inconsistency, because while one can avoid using pdfs (with some effort and
inconvenience), one can scarcely make the same kind of decisions regarding, say,
water purification plants, the electrical grid, and modern transportation networks
and still participate in the contemporary world. Tenen defends his argument that
we must understand the laminate text, including even the quantum physics entailed
when it comes to computers,” he asks rhetorically, when one might drive a car
without insisting on knowing exactly how all its components work? He answers,
“Computers . . . are dissimilar to cars in that they are epistemic, not just purely
instrumental, artifacts. They do not just get us from point A to point B; they
While I wholeheartedly agree with his conclusion that computers are epistemic, I
am less sure of his assertion about cars, which in their ability to transport us rapidly
from one environment to another might also be considered epistemic, in that they
alter the world horizons within which we live. The better argument, I think, is to say
that Tenen is especially concerned with computers because he writes for a living,
and hence writing to him is more than instrumental: it is a way of life and a crucial
component of how he knows the world and conveys his thoughts to others.
His approach, therefore, has special relevance to literary theory and criticism.
One of his central claims is that the underlying layers of code “affect all higher-level
interpretive activity” (p. 2), so that literary analysis can no longer remain content with
reading surface inscriptions but must plunge into the code layers to discern what
operations are occurring there. This concern is reflected within the emerging field
of electronic literature in a rich diversity of ways. Here I should clarify that whereas
specifically to digital writing that can lay claim to the affordances and resources of
An analogy with artists’ books may be helpful here. Johanna Drucker, in her
study of the form, argues that the special characteristic of artists’ books that distinguishes them from books in
general is their mission to interrogate the book as an artistic form. What makes a
book, and how far can the boundaries be stretched and have the object still
recognized as “book”? What about a book with marble covers and no pages, or a
book with many different kinds of paper but no ink (described in Hayles 2002)?
Such projects draw our attention to our presuppositions about what a book is and
Similarly, electronic literature asks how far the boundaries of the “literary” may
be stretched and still have the work perform as literature, for example in works
where words are fused with, and sometimes replaced by, animations, images,
sounds, and gestures. How do our ideas about literature change when the text
including even breathing (see for example Kate Pullinger’s iPhone story, “Breathe,”
2018)? These questions imply that electronic literature is a special form of digital
writing that interrogates the status of writing in the digital age by using the
resources of literature, even as it also brings into question what literature is and can
be.
3. Post//Code//Human
To see how works of electronic literature encourage users to plunge into the
underlying code, I turn to Sea and Spar Between, a collaborative work authored by
Nick Montfort, a Ph.D. in computer science who also creates works of generative
poetry, and Stephanie Strickland, a prize-winning poet who works in both print and
electronic media. The authors chose passages from Melville’s Moby Dick to combine
with Emily Dickinson’s poems; the program’s code combines fragments from the
neologisms with one syllable from Dickinson and the other from Melville. The
the code and constructed the database from which the algorithms draw for their
recombinations. The screenic display is a light blue canvas on which the quatrains
appear in a darker blue, color tones that metaphorically evoke the ocean with the
with 14,992,383 positions along each axis, resulting in about 225 trillion stanzas
(14,992,383 × 14,992,383 ≈ 225 trillion), roughly the number, the authors estimate, of fish in the sea. As Stuart Moulthrop points out
(Moulthrop and Grigar, 2017, p. 35), the numbers are staggering and indicate that
the words displayed on a screen, even when set to the farthest zoom-out setting, are
Figure 1. Screen shot of Sea and Spar Between, at closest zoom.
Figure 2. Screen shot of Sea and Spar Between, taken at medium zoom.
Figure 3. Screen shot of Sea and Spar Between, farthest out zoom.
fish’s rapid and erratic flashings as it swims. It is possible to locate oneself in this
bottom. This move will result in the same set of words appearing on screen as were
previously displayed at that position. Conceptually, then, the canvas pre-exists in its
entirety, even though in practice, the very small portion displayed on the screen at a
given time is computed “on the fly,” because to keep this enormous canvas in
their comments when they remark that the work signals “an abundance exceeding
authors reinforce the idea of a reader lost at sea in their essay on this work, “Spars
of Language Lost at Sea” (Montfort and Strickland 2013). They point out that
randomness does not enter into the work until the reader opens it and begins to
read. “It is the reader of Sea and Spar Between who is deposited randomly in an
ocean of stanza each time she returns to the poem. It is you, reader, who are
How does the work invite the reader to plunge into the underlying code? The
invitation takes the form of an essay that the authors have embedded within the
source code, marked off by double slashes (yet another connotation of my title), the
convention in JavaScript for marking comments (that is, lines that are not executable
statements). The essay is entitled “cut to fit the toolspun course,” a phrase
generated by the program itself. The comments make clear that human judgments
played a large role in text selection, whereas computational power was largely
//most of the code in Sea and Spar Between is used to manage the
//interface and to draw the stanzas in the browser’s canvas region. Only
//2609 bytes of the code (about 22%) are actually used to combine text
//fragments and generate lines. The remaining 5654 bytes (about 50%)
By contrast, the selection of texts was an analog procedure, intuitively guided by the
generating //lines. We did this not quantitatively, but based on our long
sections of the code and also provides a commentary on the project itself,
functioning in this role as literary criticism. The comments, plus the code they
explicate, make clear the extent of the computer’s role as collaborator. The computer
knows the display parameters, how to draw the canvas, how to locate points on this
two-dimensional surface, and how to center a user’s request for a given latitude and
longitude. It also knows how to count syllables and what parts of words can
combine to form compound words. It knows, the authors comment, how “to
generate each type of line, assemble stanzas, draw the lattice of stanzas in the
browser, and handle input and other events.” That is, it knows when input from the
user has been received and it knows what to do in response to a given input. What
it does not know, of course, are the semantic meanings of the words and the literary
authors outline what they see as the user’s involvement as responder and critic.
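To make the computer’s share of this knowledge concrete, the following is a minimal sketch in Python (my illustration, not the authors’ JavaScript, and greatly simplified) of how a stanza can be computed “on the fly” from its lattice coordinates: seeding the pseudo-random choices with the position guarantees that the same latitude and longitude always regenerate the same words, so the canvas can pre-exist conceptually without ever being stored. The fragment lists and the compound-word rule are hypothetical stand-ins for the database the authors built from Dickinson and Melville.

import random

# Hypothetical stand-in fragments; the actual work draws on a database the
# authors assembled from Dickinson's poems and Moby Dick.
DICKINSON = ["hope is the thing with feathers", "zero at the bone", "I dwell in possibility"]
MELVILLE = ["the whiteness of the whale", "a damp, drizzly November", "the ungraspable phantom"]
DICKINSON_SYLLABLES = ["hope", "bone", "noon"]
MELVILLE_SYLLABLES = ["whale", "spar", "sea"]

def stanza_at(x, y):
    """Deterministically compute the quatrain at lattice position (x, y)."""
    rng = random.Random(x * 14992383 + y)   # same coordinates -> same stanza
    lines = [rng.choice(rng.choice((DICKINSON, MELVILLE))) for _ in range(3)]
    # One line built around a compound word joining a syllable from each author.
    lines.append("of " + rng.choice(DICKINSON_SYLLABLES) + rng.choice(MELVILLE_SYLLABLES) + " and spar")
    return "\n".join(lines)

# Revisiting a position reproduces the words previously displayed there.
assert stanza_at(1234, 56789) == stanza_at(1234, 56789)
print(stanza_at(1234, 56789))

Nothing here approximates the authors’ actual line-generation rules; the point is only that deterministic seeding lets an effectively boundless text be generated piecewise, on demand, from a small program and database.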
—//accomplishes this task. But in another, and likely more novel, way,
To the extent that the “new constitution” could not be implemented without the
computer’s knowledge, intentions and beliefs, the computer becomes not merely a
device to display the results of human creativity but a collaborator in the project. By
enticing the user/reader to examine the source code, the authors make clear the
cognitive assemblage activated when a reader opens the work on her computer and
4. Evolving//Posthuman
Sea and Spar Between does not invoke any form of artificial intelligence, and
differs in this respect from my next example, which does make such an invocation.
//These rules [governing how the stanzas are created] are simple; there is
no elaborate AI architecture
Håkan Jonson, takes the computer’s role one step further, from collaborator to co-
creator, or better perhaps poetic rival, programmed to erase and overwrite the
words of Heldén’s original. Heldén is a true polymath, not only writing poetry
but also creating visual art, sculpture, and sound art. His books of poetry often
contain images, and his exhibitions showcase his work in all these different media.
Jonson, a computer programmer by day, also creates visual and sound art, and their
collaboration on Evolution reflects the authors’ multiple talents. The authors write
in a preface that the “ultimate goal” of Evolution is to pass “’The Imitation Game’ as
proposed by Alan Turing in 1951. . . when new poetry that resembles the work of
when framed by the actual workings of the program. In the 2013 version, the
authors input into a database all ten of the then-extant print books of Heldén’s
poetry. A stochastic model of this textual corpus was created using a statistical
model known as a Markov Chain (and the corresponding Markov Decision Process),
a discrete state process that moves randomly step-wise through the data, with each
next step depending only on the present and not on any previous states.
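For readers unfamiliar with the technique, a toy sketch in Python (my illustration, not Jonson’s implementation) shows what a word-level model of this kind looks like; whether the authors’ chain operates on words, lines, or other units is an assumption here. The corpus is reduced to a table recording which words follow which, and generation walks that table one step at a time, each step conditioned only on the current word.

import random
from collections import defaultdict

def build_markov_model(corpus_words):
    """Map each word to the list of words that follow it in the corpus."""
    model = defaultdict(list)
    for current, following in zip(corpus_words, corpus_words[1:]):
        model[current].append(following)
    return model

def generate(model, start_word, length=12, rng=random):
    """Walk the chain: every step depends only on the present word."""
    word = start_word
    output = [word]
    for _ in range(length - 1):
        choices = model.get(word)
        # At a dead end, restart from a randomly chosen word in the corpus.
        word = rng.choice(choices) if choices else rng.choice(list(model))
        output.append(word)
    return " ".join(output)

# Hypothetical usage with a stand-in corpus (Heldén's ten books are the real source).
corpus = "the garden breathes and the light breathes through the garden".split()
print(generate(build_markov_model(corpus), "the", length=8))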
are then evaluated according to some fitness criteria, and one is selected as the most
“fit.” In this case, the fitness criteria are based on elements of Heldén’s style; the
idea is to select the “child” algorithm whose output most closely matches Heldén’s
own poetic practices. Then this algorithm’s output is used to modify the text, either
replacing a word (or words) or changing how a block of white space functions, for
example, putting a word where there was white space originally (all the white
spaces, coded as individual “letters” through their spatial coordinates on the page,
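A toy version of this selection step, again mine rather than the authors’ and built on assumed fitness criteria (the stylistic profile below is an invented stand-in for the one derived from Heldén’s corpus), may clarify the logic: several candidate outputs are scored against the profile, and the best-scoring candidate supplies the replacement.

def style_fitness(candidate, profile):
    """Toy fitness: closeness of average word length plus vocabulary overlap.
    The real criteria in Evolution are more elaborate than these two features."""
    words = candidate.split()
    avg_len = sum(len(w) for w in words) / len(words)
    length_score = 1.0 / (1.0 + abs(avg_len - profile["avg_word_length"]))
    overlap = len(set(words) & profile["vocabulary"]) / len(words)
    return 0.5 * length_score + 0.5 * overlap

def select_fittest(candidates, profile):
    """Return the candidate whose output most closely matches the profile."""
    return max(candidates, key=lambda c: style_fitness(c, profile))

# Hypothetical stand-ins for the corpus-derived profile and the candidates
# produced by competing "child" processes.
profile = {"avg_word_length": 5.2, "vocabulary": {"garden", "membrane", "static", "ljus"}}
candidates = ["garden of static light", "xq zzv", "membrane ljus drift"]
print(select_fittest(candidates, profile))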
The interface presents as an opened book, with light grey background and
black font. On the left side is a choice between English and Swedish and a slider
controlling how fast the text will evolve. On the right side is the text, with words
and white spaces arranged as if on a print page. As the user watches, the text
changes and evolves; a small white rectangle flashes to indicate spaces undergoing
time the program is opened, one of Heldén’s poems is randomly chosen as a starting
point, and the display begins after a few hundred iterations have already happened
(the authors thought this would be more interesting than starting at the beginning).
At the bottom of the “page” the number of the generation is displayed (starting from
Also displayed is the dataset used to construct the random seed. The dataset
changes with each generation, and a total of eighteen different datasets are used,
per episode of Twin Peaks” (Evolution [print book] 2014, n.p.). These playful
astronomical data, suggesting that the evolutionary process can be located within
slightly varying drone, is generated in real time from sound pieces that Heldén
previously composed. From this dataset, one-minute audio chunks are randomly
selected and mixed in using cross-fade, which creates an ambient soundtrack unique
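The mixing procedure is easy to picture in code; what follows is a small sketch using NumPy, with an assumed five-second fade (the work’s actual parameters are not given here), of how randomly chosen chunks can be overlapped with linear cross-fades so that the drone never audibly breaks.

import random
import numpy as np

SAMPLE_RATE = 44100
CHUNK_SECONDS = 60
FADE_SECONDS = 5          # assumed overlap length, not documented in the work

def crossfade_stream(source_pieces, n_chunks, rng=random):
    """Concatenate randomly chosen one-minute chunks, overlapping them with linear fades."""
    chunk = CHUNK_SECONDS * SAMPLE_RATE
    fade = FADE_SECONDS * SAMPLE_RATE
    ramp = np.linspace(0.0, 1.0, fade)
    out = rng.choice(source_pieces)[:chunk].copy()
    for _ in range(n_chunks - 1):
        nxt = rng.choice(source_pieces)[:chunk]
        out[-fade:] = out[-fade:] * (1.0 - ramp) + nxt[:fade] * ramp
        out = np.concatenate([out, nxt[fade:]])
    return out

# Hypothetical usage with synthetic drones standing in for Heldén's recordings.
pieces = [np.sin(2 * np.pi * f * np.arange(CHUNK_SECONDS * SAMPLE_RATE) / SAMPLE_RATE)
          for f in (55.0, 110.0)]
mix = crossfade_stream(pieces, n_chunks=3)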
The text will continue to evolve as long as the user keeps the screen open,
Heldén’s words with those of the algorithm. One could theoretically reach a point
where all of Heldén’s original words have been replaced, in which case the program
would continue to evolve its own constructions in exactly the same way as it had
limited edition print book (Evolution [print book] 2014), in which all the code is
printed out. In this sense, the book performs in a print medium the same gambit we
saw in Sea and Spar Between, mixing interpretative essays of literary criticism
together with algorithms so that the reader is constantly plunged into the code, even
if her intent is simply to read about the work rather than reading the work itself.
Moreover, the essays appear on white pages with black ink, whereas the code is
displayed on black pages with white ink. The inverse color scheme functions as a
material metaphor that suggests the Janus-like nature of digital writing, in which
one side faces the human and the other, the machine.
The essays, labeled as appendices but placed toward the book’s beginning
and interspersed with the code sections most germane to how the program works,
Cayley, Maria Engberg, and Jesper Olsson. Cayley (2014), as if infected by the work’s
aesthetic, adopts a style that evolves through restatements with slight variations,
thus performing as well as describing his interactions with the work. In Appendix 2:
“Breath,” he suggests the work is “an extension of his [Heldén’s] field of poetic life,
privileged human processes proceeding through the mind and body of Heldén and
linguistic artifacts that are generated by compositional process such that they may
never actually be—or never be able to be . . . read by any human having the mind
“troubled,” and “never actually be/never be able,” but each time in a new context
that slightly alters the meaning. When Cayley speaks of being “troubled,” he refers
whereas the human needs to sleep, eat, visit the bathroom, the machine can
continue indefinitely, not having the same kind of “mind and body” as human
human minds and bodies can contain, recalls the excess of Sea and Spar Between and
gestures toward the new scales possible when computational media become co-
creators.
Evolution to John Cage’s aesthetic, as indicated by her title citing Cage’s famous
name for his randomizing algorithms. She quotes from another essay by Cayley (not
in this volume) in which he calls for an emphasis on process over object. “’What if
of writing and reading rather than dwelling on either textual artifacts themselves
artifacts?’” In Evolution, the code underlying the screenic artifact is itself a series of
also sees an analogy in Cage’s work, commenting that the work “was not the poet
expressing himself. He was at best a medium for something else.” This “something
processes so that it can write like Heldén, albeit without the “mind and body” that
produced the poetry in the first place. A disturbing analogy comes to mind: H. G.
Wells’ The Island of Doctor Moreau and the half-human beasts who keep asking, “Are
we not men?” In the contemporary world, the porous borderline is not between
coding writing programs” as “an attempt to align the subject with the world, to
negotiate the differences and similarities between ourselves and the objects with
which we co-exist.”
penetrated complex human systems that it has become our “natureculture” (Jonas
Ingvarsson 2014), that the “signs are all over Heldén’s poetic and artistic output.
Computer supported lyrics about nature and environments, graphics and audio
of Things”). Although he does not use the phrase, the sensibility he describes fits
had an opportunity to ask him when (with Danuta Fjellestad) I met Heldén, Jonson,
Cage, Heldén remarked that he felt “relieved,” as if a burden of subjectivity had been
lifted from his shoulders. He recounted starting Evolution and watching it for a long
time. At first he amused himself by thinking “me” or “not me” as new words
appeared on screen. Soon, however, he came to feel that this was not the most
interesting distinction he could make; rather, he began to see that when the
program was “working at its best,” its processes created new ideas, conjunctions,
and insights that would not have occurred to him (this is, of course, from a human
point of view, since the machine has no way to assess its productions as insights or
ideas, only as more or less fit according to criteria based on Heldén’s style). That
this fusion of human and machine intelligence could produce something better than
either operating alone, he commented, made him feel “joyous,” as if he had helped to
bring something new into the world based on his own artistic and poetic creations
He remarked that after he and Jonson had created Evolution, he found his
own poetic practice mutating in response to what he had seen the program produce.
assemblage in all directions. Articulations first flow from human brains and bodies
into machines, then back again from machines to humans, in a recursive system of
circular causality that has profoundly changed how we think of “the human.”
Literature//Cognitive Assemblage
mistake things for animate actors, we further diminish our capacity for critical
analysis and collective action” (Tenen, p. 11). In several places, Heldén and Jonson
genetic algorithms are not intelligent at all; they know nothing about the semantics of the
work and operate through procedures that are in principle relatively simple
(acknowledging that the ways random “seeds” are used and fitness criteria are developed
and applied in this work are far from simple, not to mention the presentation layers of
code). The power of genetic algorithms derives from finding ways to incorporate
evolutionary dynamics within an artificial medium, but like many evolutionary processes,
they are not smart in themselves, any more than are the evolutionary strategies that
animals with tiny brains like fruit flies, or no brain at all like nematode worms, have
developed through natural selection. When I asked Jonson about this objection, he
indicated that for him as a programmer, the important part was the more accurate
algorithms” (“The Algorithm,” Evolution [print book], n.p.). Whether this counts as
His lack of concern with the work’s simulative nature indicates a major
difference between literary culture as it has traditionally developed and the new
condition of the literary text as cognitive assemblage. Long before the twentieth
about 1980, the Jameson tradition of symptomatic reading (Jameson 1982) has been
a dominant trend in literary criticism. Recent calls for “surface reading” by Stephen
Best and Sharon Marcus (2009) have sought to call attention again to a work’s
aesthetic qualities, refusing the assumption that the surface is an alibi for an
emerges between surface and depth, specifically the interplay between the screen as
the site of inscription and the code that generates it. “We can no longer use
writes. “The praxis of close reading must reach down to the silicon bedrock:
material entities, and the physical structures that bear the weight of interpretation”
(p. 12).
While I am entirely sympathetic with this view, I would add that the kind of
close reading Tenen urges becomes even more powerful when contextualized
within larger cultural, social, economic and technological transformations that have
catalyzed posthuman studies. To grasp more fully the role of literature in an era of
digital writing, we must attend not only to its thematic and aesthetic qualities but to
the computational media intrinsic to its production and operation. Moreover, the
critiques even as it also performs it, in a reflexive doubling connoted by the double
texts are produced and disseminated, even when the output form they take is a print
book. When literary texts take the specific form of electronic literature, they both
the Electronic Book Review, with the title “Literary Texts as Cognitive Assemblages:
The Electronic Literature Organization, founded in the late 1990s to promote and
disseminate electronic literature, has wrestled with the difficult issue of how to define
electronic literature in ways that include recognized works in this diffuse field, some of which
may lack words but use animations, gestures, and other means to convey what one might call
“literary” experiences, while still excluding such mundane writing practices as pdfs and urls. I
recommend a visit to the ELO website for readers who want further clarification of this issue,