I let my students use AI to mediate essay writing. What happened was unexpected.
They didn’t use AI to coach their writing.
2023 was a wild time for AI in the public consciousness. I remember the tense and lively staffroom conversations at my high school workplace. We teachers couldn’t see what this sudden, spiralling change meant for education. But as good teachers at a technologically driven school, we wanted to see whether AI could augment and enhance education, not just disrupt it.
The destruction was already evident. Students were turning in work that was stratospherically above their normal language levels. I could tell homework was being prompted a few seconds before class. The kids already relied on ChatGPT to cheat. They weren’t adept yet at hiding their cheating (the first response was pasted and onwards they blissfully went), but it was obvious to me and my colleagues that things were changing, and could spiral.
For an English teacher like me, ChatGPT was particularly alarming. At high school level, language curricula emphasize the essay form, particularly the academic essay. The logic has held for decades: writing a complex, cognitively dense, stylistically challenging piece requires critical thinking, research skills, wide stylistic range and command of academic diction. We can assess those skills simultaneously in one essay task. The act of preparing and writing a piece is didactic. Essays have been the gold standard for formative and summative assessment for years. They’re used extensively for college admissions. Universities expect high schoolers to enter tertiary education with a passing competence at academic essay writing – which is, after all, the backbone of academic publishing.
But ChatGPT had come for the essay. And obliterated a key technique I relied on to help my students become critical and provocative thinkers. We could still make kids write essays in class by hand, under supervision, but that meant the tasks had to be completed in short periods. It wasn’t the same.
I was inspired by an interview on the New York Times’s Hard Fork podcast. A high school English teacher said something that struck me: ChatGPT lets all students start an essay with a baseline skeleton, regardless of their writing proficiency. It can level the playing field for advanced work.
For context, let me explain what often happens in an English classroom when we’re crafting essays. Many students struggle with what we might call lower-order linguistic skills, even in high school. When they write essays on challenging topics, they’re focused on the mechanics of syntax. Their vocabulary is too narrow to give them real expression. When they get feedback on their work, it draws their attention to these problems. Other students are more confident with the basics. They’ve progressed to working on nuanced expression, semantics, discourse-level organization and the manipulation of language for effect. The idea from the podcast was that ChatGPT gave all students the chance to begin with a passable draft of writing, forcing everyone to grapple with higher-order editing skills. It’s similar to how calculators let kids who struggle with arithmetic still engage with advanced mathematical concepts.
I wanted to try this. I had a reflective essay due in my curriculum soon, so I decided to refashion the task as AI-mediated writing. I chose the reflective essay because it would force students to include at least some personal content (getting AI to reflect on their lives would be too fake to justify cheating on that score, I thought), but also because it was a low-stakes assessment. I had room to experiment.
The brief was that students had to use ChatGPT for at least one of the following: brainstorming; getting feedback on a first draft; converting plain text into imagery or figurative language; running a sentence or paragraph through multiple rounds of prompts to engineer a desired improvement; or adjusting the intensity of imagery, the phrasing, or the degree of reliance on stock expressions.
What followed was unexpected.
First, it took a while for everyone to work out how to use ChatGPT without just pasting the question paper into the prompt.
But then the class divided into two camps.
The first group were those who tended to struggle with writing generally. They latched onto AI quickly, but they didn’t know what to do other than have the system spew out an entire essay. So that’s what they did. Then they just read the result, said it looked good and asked if they could submit.
To use AI well, they had to work out what help they needed in the process, how their work might be lacking and what language was needed to trigger a response that fixed the identified issues. For these students, each of those components was very difficult. They had seldom been in a position in English where they had to assess their own writing. Even if they broadly understood the kind of feedback they got from me, they didn’t know what to do with it. Editing was complex. Faced with a machine that could throw up text with minimal effort, the energy required to be reflective, analytical and metacognitive seemed too great. Many of these students ended up lightly tweaking machine-delivered drafts. I realized that they weren’t ready for the advanced steps required.
The second group were the kids who usually found writing easier and received high marks. They almost universally closed the ChatGPT app on their devices. It was too different from how they usually wrote. I remember a few staring at their screens with faces twisting in disgust. Critically, many of them told me that AI’s revisions robbed their writing of their voice. I watched them try all sorts of prompts, making changes they had the skills to identify they needed, but the outputs felt wrong to them. They wanted their essays to feel more like them – and they had the skills to get their drafts to that point. AI was a barrier.
AI-mediated writing had failed both groups.
Now, this was 2023. Models are far more advanced and students are more adept at prompting in 2025. Every class is different. But the lesson I took from that experience was that AI has powerful, spectacular potential to give students individualized instruction and training, and to expose them to modes of writing they’ve been held back from – but it’s a tool that easily devolves into cognitive offloading for some students and robotic intrusion into human artistry for others. If AI is to be used to develop advanced skills, students need to be ready for it. It’s a tool that can offload the teacher’s work just as readily as it offloads the students’ necessary labour.
Language teachers need to find the right mix of humanity, interactiveness, scaffolded progression and assistance for AI to enhance learning, not just be a gimmick or a bypass.