Here’s what I’m working on lately: a presentation for my university’s Interdisciplinary Colloquium.

It is not uncommon for college instructors to repeat, sometimes word for word, the same hype around generative AI that the companies selling it have pitched to potential investors. Increasingly, though, I share Ed Zitron’s assessment that the internet is undergoing a process of “economic rot,” which he describes as “conditions where we celebrate people for making ‘big’ companies but not ‘good’ companies,” or what Cory Doctorow more pointedly calls enshittification.
The more I read about generative AI, the more I find myself aligned with pedagogy scholars who have voiced skepticism about the ongoing panic surrounding it. Gavin P. Johnson invites us to “(re)consider a few things we already know about teaching with and through technology” (169), most intriguing of which are that new technologies “do not exist in isolation from cultural practices but rather reflect and reify the practices and ethics of the designers” (170), and that “the never-ending, lose-lose arms race to prevent the crisis of (possible) plagiarism” tends to treat students as hostile would-be criminals and mutates pedagogy into a form of policing (172). Meanwhile, Sandra Jamieson writes that “A pedagogical response calls on us to trust students; to teach them the work of writing and include AI in the process instead of focusing our efforts on ways to catch those who use AI or reject it as unethical” (156). This includes a reframing of form, genre, structure, and convention.
The problems that generative AI presents us with are not problems of cognition but of articulation. Any creative writer knows this to be true. This is perhaps what Kazim Ali means when writing that a “text is a body because it is made of the same flesh and blood and breath as the writer. The ‘mind’ which declares intention is a collection of senses, sense-responses, and memories. Chemically it is invented in the brain. Thought is matter” (28).
Artificial intelligence is essentially branding: a commercial label for a series of genuinely complex, advanced algorithms that are impressive as far as algorithms go. But intelligence is too often mistaken for a synonym of cognition, just as generative is not the same thing as creative. As Ed Zitron has repeatedly pointed out, programs like ChatGPT don’t actually “know” anything. Instead, in his words,
Modern AI models are trained by feeding them “publicly-available” text from the internet, scraped from billions of websites (everything from Wikipedia to Tumblr, to Reddit), which the model then uses to discern patterns and, in turn, answer questions based on the probability of an answer being correct (Zitron, “Bubble Trouble”).
Peter Elbow asserts that “writing with no voice is dead, mechanical, faceless. It lacks any sound. Writing with no voice may be saying something true, important, or new; it may be logically organized; it may even be a work of genius. But it is as though the words came through some kind of mixer rather than being uttered by a person” (287-88). I liken this style of writing to a puppet without a human hand. The language is there, the form is there, the structure and shape are all there, but on its own, it is no different from any other iteration of the same structure.
To what extent is all genre, all formula, all socially constructed literary expectation, not just a form of puppetry? AI writing consists of formulaic estimations of correct form and structure that are recognizably fraudulent without the intervention of a human touch.
As an extension of this metaphor, I want to bring in the 2023 video game Lies of P, a gothic steampunk adaptation of Pinocchio in which the player emerges half-formed in a fictional Victorian city that has created animatronic puppets as a servant class. Because of a malfunction, the puppets turn on their masters.
The player occupies an ambiguous space as a puppet capable of the uniquely human skill of lying. To progress through the game, the player must repeatedly lie about their social authenticity to gain access to human spaces, and this is so central to the game that telling the truth even once can change its outcome.
I like this metaphor more than robotics or cyborgs because it more precisely captures what students attempt when they use AI writing: passing off inorganic thought as their own. We should not teach students simply to imitate collegiate writing, but to write as a reflection of their own organic thought processes.
After Jim Henson, the creator of the Muppets, died in 1990, another performer filled the vacuum and animated Kermit the Frog in his place, and viewers recognized the difference even though the puppet itself was exactly the same from one puppeteer to the next. Student writing should be, and I use this word intentionally, revered for its originality in the same way. The form of a student essay might not change, but the voice a student brings to the form is in every instance unique, and it is that authenticity we should help cultivate, now more than ever.
Ali, Kazim. “Genre-Queer.” Bending Genre, edited by Margot Singer and Nicole Walker, 2016, pp. 27-28.
Elbow, Peter. Writing With Power. Oxford University Press, 1998.
Jamieson, Sandra. “The AI ‘Crisis’ and A (Re)turn to Pedagogy.” Composition Studies, vol. 50, no. 3, 2022, pp. 153-158.
Johnson, Gavin P. “Don’t Act Like You Forgot: Approaching Another Literacy ‘Crisis’ by (Re)Considering What We Know About Teaching with and Through Technologies.” Composition Studies, vol. 51, no. 1, 2023, pp. 169-175.


If you’re a first-time college instructor, you may have heard this piece of encouraging advice on your first day: “Don’t sweat it!” Well, studies have shown that this is physiologically impossible. In fact, the classroom setting is designed specifically to create more sweat among teachers through a combination of lights, stress, and projectors to overheat the exact spot a teacher teaches in, and nowhere else. As a result, within minutes of teaching, teachers are inevitably drenched in a thin layer of sweat they know their students can see, even those students who spend entire classes with their eyes directed into their phone screens.
My second year of teaching, now in my second Master’s degree, is keeping me busy. Last fall, I took a class on pedagogy and read selections on composition and rhetoric theory by Peter Elbow, David Bartholomae, Janice Lauer, and Paulo Freire. Mostly, though, I learned how to teach by rapidly switching from my role as an instructor to my role as a student, wearing several hats several times a day. This fall, I’m in a similar pedagogy class and teaching similar composition courses, and I find myself learning the basics all over again, with perhaps a better sense of how to fail with grace.
To the astonishment of many, I finished my first semester as a graduate instructor, and I now have a break from graduately instructing people. I have ambitious writing goals for the break (two new stories, four revisions, eight submissions), and I intend to stick to those goals (not just because my nonfiction instructor challenged me to email her if I succeeded), and now that I’ve submitted final grades, I have time to think about my first time being fully responsible for teaching forty-six people to write arguments.
You sit down at your desk awaiting students with questions. Some have already sent you emails with one concern or another; they have questions and it’s your job to answer them in office hours. So you wait.
In a week, I’ll be teaching two sections of an introductory English class using a syllabus of my own design, for my graduate program. I can choose the readings, assignments, and discussion topics, all within reason, of course (I probably wouldn’t be allowed to teach my students math; lucky them). While I’ve been a TA and writing tutor before, I’ve never been in charge of a class for a full sixteen weeks. And now I’m in charge of two classes.