Since the arrival of ChatGPT and its kind, I've heard many teaching colleagues say that they worry about their students cheating on their Writing homework. My solution is simple: whenever I meet a new group of learners, I always ask them to do their Writing in the classroom. This lets me get to know their true ability to write in English, so that I can spot any anomaly later on. In fact, I did once have an 18-year-old learner who handed in an (almost) error-free essay in the middle of the year. He eventually admitted that his girlfriend had written it, and I decided to give him the benefit of the doubt on that single occasion.
What about mentioning the elephant in the room? That was what I did last Tuesday. I was teaching a small class of teenagers at Upper-Intermediate level (B2). The lesson was on writing a discursive, or agree-disagree, essay.
After the usual brainstorming, drafting and writing stages, I showed my students a version of the same essay written by ChatGPT. My prompt to the AI programme was:
Can you please* write a 140- to 190-word essay at CEFR level B2? The essay title is 'Being successful in life depends more on a person's ability than on how hard a person tries. Do you agree?' The essay should include these ideas: studies, sport, and your own idea.
* Since ChatGPT is said to be a language model, I insisted on teaching it proper manners!
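For colleagues who like to tinker, the same prompt can also be sent to the model programmatically rather than through the chat interface. Below is a minimal sketch using the openai Python library; the model name is just an illustrative assumption (use whichever one you have access to), and the little word count at the end is my own addition for checking shortcoming number 1 below, not something I did in class.

```python
from openai import OpenAI

# Reads the OPENAI_API_KEY environment variable.
client = OpenAI()

prompt = (
    "Can you please write a 140- to 190-word essay at CEFR level B2? "
    "The essay title is 'Being successful in life depends more on a person's "
    "ability than on how hard a person tries. Do you agree?' "
    "The essay should include these ideas: studies, sport, and your own idea."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical choice of model
    messages=[{"role": "user", "content": prompt}],
)

essay = response.choices[0].message.content
print(essay)

# A quick check of whether the word limit was respected.
print(f"\nWord count: {len(essay.split())}")
```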
What my students and I did with the ChatGPT-generated essay was analyse its content. I guided them in identifying the shortcomings of ChatGPT's essay:
1. The word limit was not respected. ChatGPT gave us a 230-word essay.
2. Both the introductory and the concluding paragraphs lack coherence. The introduction begins with a sentence about the background topic of 'success'; however, this is followed by a mechanical sentence that runs along the lines of 'this essay will explore the following ideas: studies, sport, and ...' ChatGPT had obviously repeated part of my prompt. The concluding paragraph is way off the mark: ChatGPT made no attempt to address the task statement or give a final opinion.
3. Some supporting examples aren't specific enough. For instance, ChatGPT mentioned 'ability' when discussing sport, but without saying what kind of ability it meant.