AI Snake Oil: Why I Stopped Trusting the Magic Show

I will tell you something that might sound silly at first: I used to believe in AI the same way people believe in unicorns, or diet pills, or that weird machine on late-night TV that promises rock-hard abs while you sit and eat chips. I believed that AI would save me time, make me smarter, polish my writing, analyze my research, help my students, and probably teach my classes while I napped. I really believed that. But here is the truth: it did not. It does not. And I am here to say it loud—I have stopped drinking the AI Kool-Aid, and guys, was it ever spiked with some sweet, slippery snake oil. See, AI came dressed up in a glittery jacket, threw around words like ‘efficiency,’ ‘automation,’ and ‘pedagogical revolution,’ and made a lot of us clap like excited seals. I clapped too. Who would not want a shiny machine that could write lesson plans, grade essays, generate research questions, summarize books, cite sources, and whisper sweet academic nothings in your ear while you eat leftover spaghetti in front of a blinking cursor? But after the sparkle faded, I realized something: AI is not adding anything substantial to the real, deep, hard, delicious, frustrating, and soulful work of teaching or researching rhetoric. It’s like putting glitter on cardboard and calling it a Fabergé egg.

Fig. I: AI-generated image of a unicorn symbolizing epistemic purity and AI snake oil

Let me explain. I asked AI to help me brainstorm research questions. It gave me 10 questions that sounded like they were copied from a textbook written by a robot who would never read a real book. “How does digital rhetoric influence online learning environments?” Wow! Groundbreaking! My cat could think of that. And she cannot even use a mouse without getting distracted by the screen saver. I needed curiosity. I needed fire. I got tepid bathwater. Then I asked AI to help me with student feedback. I thought maybe it could draft a few encouraging lines I could personalize. What I got sounded like something from a sad greeting card factory where the writers had been replaced with soulless toasters. “Good job. Keep up the hard work.” Thanks. That’s the kind of thing that makes a student feel like a barcode. I tried to give AI a second chance. Maybe it was just having a bad data day. So I fed it more context. I told it the student was working on tone in professional emails. The response? “Try to be professional and use appropriate tone.” That’s like telling a chef, “Try not to burn it.” Thanks for the revolutionary insight. But I did not stop there. I went full nerd. I gave AI a complex rhetorical theory prompt and asked it to draft a paragraph. What came back looked like a bored undergrad had Googled “rhetorical analysis” and copy-pasted the first paragraph of Wikipedia. I mean, sure, it had all the right words—logos, ethos, kairos—but it was all foam and no coffee. All bark, no bite. All sprinkle, no donut.

I began to wonder: what exactly is AI adding to the value chain of my research? Of my pedagogy? Of my rhetorical practice? The answer I arrived at—with a dramatic sigh and a slightly wilted sandwich in my hand—was: not much. Not yet. Maybe not ever. Because what I need as a teacher, a writer, a thinker, a human is not a sterile stream of regurgitated content. I need nuance. I need context. I need slowness. I need error. I need a student staring off into space, wrestling with an idea, and then lighting up like a firefly when it finally clicks. I need the mess. I love the mess. AI does not do mess. AI does averages. It smooths everything out until nothing sticks. Nothing cuts. Nothing bleeds.

Let me say something that might get me kicked out of the 21st century: AI is not a collaborator. It is not a co-author. It is not a co-teacher. It is not a magical Oracle of Delphi with a USB port. It is a calculator with a thesaurus. And sometimes it is a hallucinating calculator who makes up stuff and says it with confidence, like that one kid in class who did not do the reading but still raises their hand. “But it is just a tool!” people say. Sure. So is a hammer. But if you use a hammer to wash your dishes, your cups are going to cry. And that is the thing: AI is being used in the wrong rooms, for the wrong reasons, with the wrong expectations. We are asking it to inspire, to create, to feel, to reflect. But that is not what it does. What it does is imitate. And imitation, as far as I know, has never written a good poem, designed a good syllabus, or made a student feel truly seen.

Fig. II: AI-generated image of the Oracle of Delphi

Let me give you a juicy example. I once asked AI to generate a short dialogue between Socrates and Beyoncé. Do not ask why. Just go with me. The result was a beige, baffling, boring exchange where Socrates said things like, “What is truth?” and Beyoncé said, “Let’s empower women.” It was like watching a mime reenact philosophy night at karaoke. No rhythm, no soul, no sass. Another time, I asked AI to help me generate metaphors for rhetoric. It gave me, I kid you not: “Rhetoric is like a bridge. It connects people.” Really? That is the best it could do? A bridge? I wanted fireworks. I wanted “Rhetoric is a mischievous raccoon in a library of sacred scrolls.” Or “Rhetoric is a con artist with a PhD and a velvet tongue.” Something with some flair—some flavor—some garlic. Instead, I got what AI always gives me: the blandest possible answer that no one will remember five minutes later.

So now, when someone says AI is transforming education, I tilt my head like a confused dog. Transforming it into what? A box of stale crackers? I am not saying AI cannot do cool tricks. It can summarize articles. It can generate citations (sometimes fake ones, but hey, we have all had bad days). It can give you a to-do list. But so can a Post-it note. And Post-its do not pretend they are going to replace me. Because the magic of teaching—real teaching—is not just about information delivery. It is about relationship. It is about intuition. It is about awkward silences and big questions and the electric jolt when someone’s idea leaps off the page like it grew wings. AI cannot do that. And let’s be honest, most of the time, it is not even trying.

The other day, a student told me, “I asked ChatGPT for help and it gave me a pretty good answer, but I still did not get it.” That is the whole point. Good teaching is not about answers. It’s about ways of thinking. It’s about questions that unravel you and slowly put you back together. AI does not know how to not know. It does not wrestle. It does not wonder. It just spits out answers.

So I have decided: I am staying messy. I am staying human. I am keeping my sarcasm, my pauses, my sweaty palms, my failed metaphors, my joyful rambling, and my stubborn refusal to believe that a machine that has never loved or lost can teach anyone what it means to write well or think hard or care deeply. AI is fine for what it is—a tool. A digital Swiss army knife that sometimes forgets it is holding a spoon. But it is not the future of teaching. It is not the soul of rhetoric. And it is definitely not the secret sauce of research. The sauce is still us: the long walks, the quiet mornings, the random napkin notes. The student who makes a joke that surprises you. The sentence that hits you so hard you stop and read it twice. That is real. That is deep. That is not artificial. That is the good stuff.

Therefore, let the AI talk. Let it type. Let it generate. I will be over here, with my pen, my paper, my voice, my students, my questions, and my beautiful, wild, irreducible human brain—doing the real work.

No snake oil is necessary.
