Works in Progress, 2: Cyborgs, Puppets, and AI Writing

Here’s what I’m working on lately: a presentation for my university’s Interdisciplinary Colloquium.

Left: Geppetto and Pinocchio, Bemporad & Figlio, Firenze 1902. Right: Jim Henson and Kermit, 1979.

It has not been uncommon for college instructors to repeat, sometimes word for word, the same hype around generative AI that the very companies selling it have pitched to potential investors. Increasingly, though, I share Ed Zitron’s assessment that the internet is undergoing a process of “economic rot,” which he describes as “conditions where we celebrate people for making ‘big’ companies but not ‘good’ companies,” or, as Cory Doctorow more pointedly calls it, enshittification.

The more I read about generative AI, the more I find myself aligned with pedagogy scholars who have voiced skepticism about the ongoing panic surrounding it. Gavin P. Johnson invites us to “(re)consider a few things we already know about teaching with and through technology” (Johnson 169), most intriguing of which is that new technologies “do not exist in isolation from cultural practices but rather reflect and reify the practices and ethics of the designers” (170), and that “the never-ending, lose-lose arms race to prevent the crisis of (possible) plagiarism” tends to treat students as hostile would-be criminals and mutates pedagogy into a form of policing (172). Meanwhile, Sandra Jamieson writes that “A pedagogical response calls on us to trust students; to teach them the work of writing and include AI in the process instead of focusing our efforts on ways to catch those who use AI or reject it as unethical” (Jamieson 156). This includes a reframing of form, genre, structure, and convention.

The problems that generative AI presents us with are not problems of cognition, but of articulation. Any creative writer knows this to be true. This is perhaps what Kazim Ali means when writing that a “text is a body because it is made of the same flesh and blood and breath as the writer. The ‘mind’ which declares intention is a collection of senses, sense-responses, and memories. Chemically it is invented in the brain. Thought is matter” (28).

Artificial intelligence is essentially a form of branding for the commercialization of a series of genuinely complex, advanced algorithms that are impressive as far as algorithms go. But the word intelligence is too often mistaken for a synonym of cognition, just as generative is not the same thing as creative. As Ed Zitron has repeatedly pointed out, programs like ChatGPT don’t actually “know” anything. Instead, in his words,

Modern AI models are trained by feeding them “publicly-available” text from the internet, scraped from billions of websites (everything from Wikipedia to Tumblr, to Reddit), which the model then uses to discern patterns and, in turn, answer questions based on the probability of an answer being correct (Zitron, “Bubble Trouble”).

Peter Elbow asserts that “writing with no voice is dead, mechanical, faceless. It lacks any sound. Writing with no voice may be saying something true, important, or new; it may be logically organized; it may even be a work of genius. But it is as though the words came through some kind of mixer rather than being uttered by a person” (Elbow 287-288). I liken this style of writing to a puppet without a human hand. The language is there, the form is there, the structure and shape are all there, but without that hand, it is no different from any other iteration of the same form.

To what extent is all genre, all formula, all socially constructed literary expectation, not just a form of puppetry? AI writing consists of formulaic estimations of correct form and structure that are recognizably fraudulent without the intervention of a human touch.

As an extension of this metaphor, I want to bring in the 2023 video game Lies of P, a gothic steampunk adaptation of Pinocchio in which the player emerges half-formed in a fictional Victorian city that has created animatronic puppets as a servant class. Because of a malfunction, the puppets turn on their masters.

The player occupies an ambiguous space as a puppet capable of the uniquely human skill of lying. To progress, the player must repeatedly lie about their social authenticity to gain access to human spaces, and lying is so central to the game that telling the truth even once can change its outcome.

I like this metaphor more than robotics or cyborgs because it more accurately captures what students seem to be trying to accomplish when they use AI writing: to pass off inorganic thought as their own. We should not teach students to simply imitate collegiate writing, but to write as a reflection of their organic thought processes.

After Jim Henson, the creator of the Muppets, died in 1990, another performer filled the vacuum and animated Kermit the Frog in his place, and viewers recognized the difference even though the puppet itself was exactly the same from one puppeteer to the next. Student writing should be, and I use this word intentionally, revered for its originality in the same way. The form of a student essay might not change, but the voice a student brings to that form is in every instance unique, and it is that authenticity we should help to cultivate, now more than ever.


Ali, Kazim. “Genre-Queer.” Bending Genre, edited by Margot Singer and Nicole Walker, 2016, pp. 27-28.

Elbow, Peter. Writing With Power. Oxford University Press, 1998.

Jamieson, Sandra. “The AI ‘Crisis’ and A (Re)turn to Pedagogy.” Composition Studies, vol. 50, no. 3, 2022, pp. 153-158.

Johnson, Gavin P. “Don’t Act Like You Forgot: Approaching Another Literacy ‘Crisis’ by (Re)Considering What We Know About Teaching with and Through Technologies.” Composition Studies, vol. 51, no. 1, 2023, pp. 169-175.
