created 2025-02-18, & modified, =this.modified
rel: Oulipo Spliced Book - Alternative Reading Methods
Abstract:
Since the rise of mainframe computing, literary authors and critics alike have expressed anxiety about the computer’s ability to write narrative prose and poetry as well as or better than humans. With the recent emergence of publicly accessible AI authoring platforms such as ChatGPT, these fears may appear to have been well-founded. In this article I will situate contemporary digital literary practices of reading, writing, rewriting, and performing computer-generated variable texts within broader social and historical contexts. Experimentation with generative, permutational, and combinatory text began long before digital computers came into being. How and why does experimentation with generative or in other ways variable text emerge within certain human and machinic generations? How do our attempts to make writing machines help us understand how we write ourselves?
Two and a half human generations have passed since the first generation of mainframe computers emerged in the 1940s.
Pre-digital
Carmen XXV, a work of the fourth-century poet Optatianus Porfyrius, in which lists of words written in four columns may be arbitrarily combined by the reader to create “1.62 billion possible permutations of the text.”
The result is a self-reflexive text which, as Cramer states, “jumbles its own words, performs and confuses itself simultaneously”
In Gulliver’s Travels, Swift describes a machine at the Academy of Lagado for improving speculative knowledge by practical and mechanical operations.
Six hours a day the young students were employed in this labour; and the professor showed me several volumes in large folio, already collected, of broken sentences, which he intended to piece together, and out of those rich materials, to give the world a complete body of all arts and sciences … “the most ignorant person at a reasonable charge, and with a little bodily labour, may write books in philosophy, poetry, politicks, law, mathematicks and theology, without the least assistance from genius or study”
Mainframe
Roald Dahl: “The Great Automatic Grammatizator” (1953), in which a machine writes fiction so well that it dominates the field of publishing.
Turing: “The whole thinking process is still rather mysterious, but I believe that the attempt to make a thinking machine will help us greatly in finding out how we think ourselves.”
Stochastic texts
In 1959, Theo Lutz began experimenting with random variation in texts. His text generators were based on the logical structures of mathematics. The Zuse Z22, an early computer with magnetic core memory, was particularly well-suited to random variation.
The Z22 is especially suited to applications in extra-mathematical areas. It is particularly suited to programs with a very logical structure, i.e. for programs containing many logical decisions. The machine’s ability to print the results immediately, on demand, on a teleprinter is ideal for scientific problems.
The Castle
Lutz’s most oft-quoted generator, The Castle (1959), basks in a reflected fascination with its source text, The Castle (1926), an unusual novel written by hand by Franz Kafka, a now well-known but in his own time unpublished and utterly obscure literary author of a previous generation. The print text of the novel The Castle that we now think of as definitive was unfinished at the time of Kafka’s death in 1924. Kafka’s friend and mentor Max Brod edited the novel heavily before publishing it posthumously, against the author’s wishes, in 1926. The formal structure of The Castle lends itself to random variation. The unobtainable goal of an ending is inherent in this text. But what if Lutz had selected a different source text by a lesser-known author, or a text from outside of literature altogether? Given the scientific terms he used to describe his own work, would digital literary critics still consider it literary? Although largely rhetorical in nature, one pragmatic approach to following this line of questioning further would be to attempt to apply Lutz’s computational processes to a different source text. Towards this end, remix as a research method will be discussed later on in this article.
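Lutz’s procedure can be sketched in a few lines. This is a hedged reconstruction, not his actual program or word lists: by most accounts he drew subjects and predicates from the first chapter of Kafka’s novel and combined them at random with logical quantifiers; the words below are invented placeholders, which is also how a “remix” with a different source text would work.

```python
import random

# Illustrative word lists only -- Lutz's originals came from Kafka's The Castle.
SUBJECTS = ["COUNT", "STRANGER", "CASTLE", "VILLAGE"]
PREDICATES = ["OPEN", "SILENT", "ANGRY", "LATE"]
QUANTIFIERS = ["A", "EVERY", "NOT A", "NOT EVERY"]

def stochastic_sentence(rng: random.Random) -> str:
    """Compose one sentence by random selection, after Lutz's 1959 method."""
    return f"{rng.choice(QUANTIFIERS)} {rng.choice(SUBJECTS)} IS {rng.choice(PREDICATES)}."

rng = random.Random(22)  # seeded so the "stochastic text" is reproducible
text = " ".join(stochastic_sentence(rng) for _ in range(4))
print(text)
```

Swapping in word lists harvested from any other source text changes the output entirely while leaving the process untouched, which is precisely what makes the question of where the “literariness” resides so pointed.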
Digression
Digging deeper into this, I found that Lutz wrote in his essay (2008):
It seems to be very significant that it is possible to change the underlying word quantity into a “word field” using an assigned probability matrix, and to require the machine to print only those sentences where a probability exists between the subject and the predicate which exceeds a certain value. In this way it is possible to produce a text which is “meaningful” in relation to the underlying matrix.
One predominant domain of AI research follows this thread suggested by Lutz: statistical probability. In addition, Lutz’s notion implies that the matrix of language is analogous to a network and that proximal sets may evoke meaningful relations, or perhaps that meaning is a pathway between mathematically linked nodes. All of these notions remain active research paths.
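Lutz’s “word field” idea can be made concrete with a minimal sketch. The probability matrix and words below are invented for illustration; the point is only the mechanism he describes: an assigned matrix links subjects to predicates, and the machine prints only those pairings whose value exceeds a threshold.

```python
# Hedged sketch of Lutz's "word field": P[i][j] is the assigned probability
# that subject i meaningfully takes predicate j. Values are invented.
SUBJECTS = ["CASTLE", "STRANGER", "VILLAGE"]
PREDICATES = ["OLD", "SILENT", "FAR"]

P = [
    [0.9, 0.2, 0.1],  # CASTLE: very likely OLD, rarely SILENT or FAR
    [0.1, 0.7, 0.3],  # STRANGER: most plausibly SILENT
    [0.2, 0.4, 0.8],  # VILLAGE: most plausibly FAR
]

THRESHOLD = 0.5

def meaningful_sentences() -> list[str]:
    """Emit only subject-predicate pairs whose matrix value exceeds the threshold."""
    out = []
    for i, subj in enumerate(SUBJECTS):
        for j, pred in enumerate(PREDICATES):
            if P[i][j] > THRESHOLD:
                out.append(f"THE {subj} IS {pred}.")
    return out

print(meaningful_sentences())
# → ['THE CASTLE IS OLD.', 'THE STRANGER IS SILENT.', 'THE VILLAGE IS FAR.']
```

Read as a network, each matrix entry is an edge weight between two word-nodes, and a “meaningful” sentence is simply a path whose weight clears the threshold.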
Love Letters
rel:
The Love Letter Generator That Foretold ChatGPT
Christopher Strachey’s Love Letter generator. Strachey was of almost the same generation as Alan Turing, but not quite. Both were brilliant code breakers and code makers, math puzzlers, and playful experimenters, and both were lonely gay men muzzled by post-war retrenchment of homophobic conservative values.
DARLING LOVE
YOU ARE MY AVID FELLOW FEELING. MY AFFECTION CURIOUSLY CLINGS TO
YOUR PASSIONATE WISH. MY LIKING YEARNS FOR YOUR HEART. MY TENDER
LIKING. YOU ARE MY WISTFUL SYMPATHY.
YOURS LOVINGLY,
M.U.C.
Those doing real men’s jobs on the computer, concerned with optics or aerodynamics, thought this silly, but it was as good a way as any of investigating the nature of syntax, and it greatly amused Alan and Christopher Strachey – whose love lives, as it happened, were rather similar too.
Strachey created a system of deliberate simplicity. It failed intentionally, and humorously.
I see the love letter generator, not as a process for producing parodies, but as itself a parody of a process. The letters themselves are not parodies of human-authored letters; rather, the letter production process is a parodic representation of a human letter-writing process. It is not a subtle parody, driven by complex structures that circuitously but inevitably lead, for example, to the same small set of vapid sentiments stored as data. Rather it is a brutally simple process, representing the authoring of traditional society’s love letters as requiring no memory, driven by utterly simple sentence structures, and filled out from a thesaurus. The love letter generator, in other words, was as complex as it needed to be in order to act out a parody.
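That brutal simplicity is easy to act out. The sketch below is a hedged reconstruction of Strachey’s process, not his code: a fixed salutation and sign-off, two simple sentence templates chosen at random, and slots filled from word lists. The lists here are illustrative, drawn partly from the sample letter above; Strachey’s came from a thesaurus.

```python
import random

# Illustrative word lists only -- Strachey filled his slots from Roget's thesaurus.
SAL1 = ["DARLING", "HONEY", "DEAR"]
SAL2 = ["LOVE", "SWEETHEART", "DUCK"]
ADJS = ["AVID", "TENDER", "WISTFUL", "PASSIONATE"]
NOUNS = ["FELLOW FEELING", "SYMPATHY", "LIKING", "HEART"]
VERBS = ["CLINGS TO", "YEARNS FOR", "LONGS FOR"]
ADVS = ["CURIOUSLY", "LOVINGLY", "ANXIOUSLY"]

def love_letter(rng: random.Random, n_sentences: int = 5) -> str:
    """Assemble a letter from two templates: no memory, no subtlety."""
    body = []
    for _ in range(n_sentences):
        if rng.random() < 0.5:  # short form: "YOU ARE MY <adj> <noun>."
            body.append(f"YOU ARE MY {rng.choice(ADJS)} {rng.choice(NOUNS)}.")
        else:                   # long form: "MY <adj> <noun> <adv> <verb> YOUR <adj> <noun>."
            body.append(f"MY {rng.choice(ADJS)} {rng.choice(NOUNS)} "
                        f"{rng.choice(ADVS)} {rng.choice(VERBS)} "
                        f"YOUR {rng.choice(ADJS)} {rng.choice(NOUNS)}.")
    return "\n".join([f"{rng.choice(SAL1)} {rng.choice(SAL2)},", "",
                      " ".join(body), "",
                      f"YOURS {rng.choice(ADVS)},", "M.U.C."])

print(love_letter(random.Random(1951)))
```

The entire “author” fits in a dozen lines, which is exactly the parodic claim: that the conventional love letter requires no memory and no genius, only templates and a thesaurus.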