If machines do the writing, students will stop doing the thinking

AI is close to being able to write students’ essays for them. But that will not help them understand why they think what they think, says Jane Rosenzweig

November 23, 2022

As a college writing teacher, I’ve been joking with colleagues for years about what will happen when artificial intelligence becomes competent enough to write student papers – and to grade them (the rest of us will head for the beach!). The idea that language-generating AI was going to change the way people write was a distant concern, until it wasn’t.

Since OpenAI made its language-generating model, GPT-3, available to the public last year, I’ve been experimenting with it to see how close it can come to writing as well as my students. The interface is as easy as ordering coffee: you type a request in a text box and your text is generated in a few seconds. When I typed “write a paragraph about what college students lose if they let AI write their papers”, I had my paragraph in less than 10 seconds, for less than 10 cents:

“If college students let AI write their papers, they may miss out on important writing skills,” it read. “AI cannot yet replicate the human mind, so it may not be able to capture the nuances and subtleties of writing. In addition, AI papers may not be able to accurately represent the student’s voice or style. As a result, students may end up with a paper that is less than ideal.”

Indeed, this answer is neither nuanced nor subtle, but it won’t be long before our students are able to create convincing papers in a few minutes with minimal effort – ones that do appear to be the product of critical thinking.

Schools will no doubt add turning in machine-generated papers to the list of what counts as academic misconduct, along with plagiarism and buying papers on the internet. But I’m more interested in how we’ll convince students that just because you can outsource your writing to a machine doesn’t mean you should.

Sometimes my students see writing a paper as a hoop they are being asked to jump through so I can evaluate them. In other words, they see writing solely as a product. If that were so, there might be good reason to turn to GPT-3. However, as writing instructors tell their students, the writing process matters because writing is a form of thinking; we figure out what we think when we try to write about it. More often than not, when I read a student paper draft, I’ll find the most interesting and important point in the conclusion; the student had to write the rest of those paragraphs to figure out that point.

Writing – in the classroom, in your journal, in a memo at work – is a way of bringing order to our thinking or of breaking apart that order as we challenge our ideas. If a machine is doing the writing, then we are not doing the thinking. It only took me a few minutes of experimenting with GPT-3 before I was able to generate introductory paragraphs that mimic those my students might draft, on their own, today. When I asked the software to conjure up a thesis statement that contained an objection to an argument in Michael Sandel’s book The Case Against Perfection, it gave me this:

“One potential benefit of genetic engineering is that it could create a more unified and diverse community. By allowing individuals to choose their own physical and mental traits, genetic engineering could lead to a world in which people are not judged by their appearance or abilities. This would create a more tolerant and inclusive society.”

If one of my students drafted this paragraph, I would ask them why they think that no one would judge others for the traits they had selected and why people would opt for a diverse range of traits rather than choosing to look like movie stars. The student might concede that being able to choose our traits wouldn’t necessarily lead to a less judgemental society. Or they might argue that since there is so much societal pressure to look a certain way, it would be more equitable if everyone could look that way. Either way, they would have developed a clearer and more nuanced position on the topic. As I tell my students, there’s no point in writing a paper unless writing it helps you understand why you think what you think.

There are many ominous science fiction stories about what might happen if we are defeated by our own machines. But the evidence suggests that the bigger peril is to outsource too many processes to them. Perhaps the most worrying outcome of outsourcing writing is that we will lose our commitment to the idea that we ought to believe what we write – and, by extension, what we say. That commitment is already under threat from disinformation campaigns and the speed at which social media moves. Each semester, I tell my students about the magazine editor who, upon learning that I had not checked a fact in an article I was working on, said to me, “If you’re going to put your name on something, don’t you want to know that it’s true?”

Would it matter if we stopped believing what we write? I asked GPT-3. “No it does not matter if we believe what we write,” it replied [sic].

We’ve reached the point where we can’t easily distinguish machine writing from human writing, but we shouldn’t lose sight of the huge difference between them.

Jane Rosenzweig is director of the Harvard College Writing Center at Harvard University. This is an edited version of an article that first appeared in the Boston Globe.


