The technology muftis assured us that artificial intelligence will make our jobs easier. They weren’t talking about teachers. Writing might be the most important skill that teachers try to impart to students; ChatGPT has made that much, much more difficult. Maybe impossible.
The answer, according to some, is to talk up writing and persuade students of its value. Whenever I read articles like this, I think: “Has the author ever met a kid before?” I spend a lot of time trying to convince students to do things that are arduous but worthwhile: pray, attend Mass, go to confession, and so on. Usually, I fail. Writing is harder, more time-consuming, drearier, and less obviously valuable. The odds of persuading kids to do it are, I thought, Buster-Douglasian.
But recently I had a conversation in class that made me question that judgment. A little background:
My sister’s a teacher who works for a Catholic diocese. A few weeks ago, her bosses subjected her to a presentation about the glories of artificial intelligence. Reportedly, it was torture, with the low point coming when the presenter bragged about a school that used AI to compose a letter to their community, after one of their students committed suicide.
Repulsive. Contemptible. And instructive. It shows what’s wrong with AI.
Personal writing is about creating a bond between author and audience. That’s why it’s called communication: it builds communion. Joy, anxiety, curiosity, outrage, love, confusion, and grief can be distilled and shared. The process goes: 1) feel, 2) think about your feelings, and 3) put thought and feeling into words. But the value is not in the words; it’s in the thought and feeling. Words are just the medium. AI can’t think or feel, so the writing it does is fraudulent, or if you prefer, artificial.1
Here’s the good news: when I recounted that story to my students, they found it appalling. Most of them are infatuated with AI, but they understood how despicable it was to outsource mourning to a computer. Their reaction gave me hope. As long as kids recognize that some things should not be written by AI, maybe we can get them (or realistically, a handful of them) to see that they should learn to write.
Competence in any worthwhile skill takes years of practice: if you’ve never sewn, played poker, or swung a baseball bat, you’re going to do terribly at first. Writing is no different. After recounting stories like the one I told above, our pitch to students can be: “One day, you’ll need to write with love, empathy, authenticity, and compassion. Let’s learn how.” If teachers sell it that way, and craft lessons and assignments around that idea, writing may stand a chance.
As an aside, in conversations about AI, no one ever seems to point out that using it is dishonest. I’ve never seen anyone attribute their work to ChatGPT, so a person who uses AI is misrepresenting himself or herself as the author. Shouldn’t that raise ethical concerns? And if AI becomes ingrained in our culture, which seems inevitable, should we be worried that it might inculcate other habits of soft deception in people?