
From ChatGPT bans to task forces, universities are rethinking their approach to academic misconduct

Postsecondary institutions are playing catch-up as generative AI and other emerging technologies make their way into student coursework.

BY DIANE PETERS | MAY 10 2023

Cheating has changed. It’s on the rise as learning takes place in an increasingly digital environment that includes remote exam proctoring and artificial intelligence software that can write essays, understand biology and create visual art and music. Instructors are pondering how to assess students in this new climate while universities are altering how they frame and enforce misconduct.

“There’s a broad need for conversations about shifting to a culture of academic integrity” in undergraduate teaching and learning, said Nancy Turner, senior director of teaching and learning enhancements at the University of Saskatchewan, where she’s part of a task force on academic integrity. Western University has also struck a task force on the issue while the University of British Columbia recently updated its regulations and communications on academic misconduct following recommendations from various working groups.

Universities doubling down on efforts to curb academic misconduct are driven by a steady rise in incidents over the past several years, with a spike beginning in spring 2020 as teaching and learning moved almost entirely online. At the University of Manitoba, for instance, cases went from 891 in 2018–19 to 1,303 in 2019–20, and ticked downward to 1,219 in 2020–21. At the University of Toronto, offences soared to 2,203 in 2019–20 (up from roughly 1,600 in 2018–19), and rose again the following year, to a total of 3,901 cases.

For 2022–23, Western recorded 430 offences, a 12 per cent decline and the first time its numbers had gone down in five years. Though she didn’t have exact figures, Dr. Turner suspects the U of Saskatchewan’s numbers will prove to have held roughly flat over the past few years. “We’re seeing the number of incidents level out. But we’re continuing to see challenges with academic misconduct.”

Proctoring software gone wrong

In 2020, universities tried to maintain the integrity of their assessments during the pandemic pivot with new remote proctoring software. However, this software added problems of its own to the fight against misconduct. “The companies that offered these services, it was almost like they were peddling snake oil,” said Sarah Elaine Eaton, associate professor at the University of Calgary’s Werklund School of Education, who noted that companies used high-pressure tactics during a challenging period for decision-makers and made promises their products never delivered.

Dr. Eaton said the programs introduced unanticipated and troubling issues that instructors, invigilators and test-takers had to address on the fly. She recalled, for example, that some dark-skinned exam-takers had to shine a light on their faces to be recognized by the proctoring software, which had not been trained to register all skin tones. In the fall of 2020, the program Proctortrack experienced a security breach that affected students at Western, and medical students taking qualifying exams that same year reported a litany of problems, including being kicked out of the Prometric-administered online platform for no reason.

Student-led groups, including those at Western and Waterloo, spoke up and either triggered changes to the use of proctoring software or got it banned. “The balance has shifted back in favour of in-person assessment,” said Simon Bates, vice-provost and associate vice-president, teaching and learning at UBC. Many professional or nationwide exams are still offered entirely online, or with an online option. Some, like medicine’s qualifying exam, have been updated because of student complaints.

Enter generative AI

While technology can provide tools to confront academic misconduct, it can feel like an unwinnable race, with the rapid rise of generative AI programs, the most prominent being ChatGPT. “It’s like autosuggest on steroids,” said Luke Stark, assistant professor in the faculty of information and media studies at Western. “It’s able to produce legible, plausible text, and a lot of it. But it doesn’t know what a nice sentence sounds like. It doesn’t know the meaning of anything.”

Ceceilia Parnther, assistant professor of higher education leadership at St. John’s University in Queens, New York, said university administrators and faculty members should not be surprised by generative AI – it’s hardly new: “I think we’re reacting to the reality that it’s now accessible for everyone. But it’s been around a long time.” Microsoft Office, Google Docs and Grammarly use AI to check spelling and grammar and suggest phrasing. Google Translate has been confounding language teachers for years, as have AI programs that can create presentations, video and other digital content.

A university in France has banned ChatGPT, as have numerous K-12 schools in the U.S. At least one university in Canada, the Université de Montréal, has forbidden students from using it in exams and assignments unless approved by the instructor, and many others have published guidelines and FAQs for its use. “We try to put limitations on it so we feel in control,” said Dr. Parnther. She said that instead, academics should familiarize themselves with AI programs impacting their field. “I suggest we think about what tech means in our classrooms and talk about how the tech helps or harms in different ways,” she said. These programs are becoming monetized, so they’ll increasingly benefit those who can pay.

These programs have potential as teaching tools or learning aids. “It could be a great supportive tool for reluctant writers,” said Dr. Eaton. Dr. Turner said instructors could integrate it into coursework, for example by having AI generate a short essay in class and then asking students to pick it apart.

Instructors are seeing students use generative AI to cheat, with one U.S. survey showing 48 per cent of students had used it for an at-home test or quiz and 53 per cent to write an essay. Dr. Stark worries that students could covertly use AI to generate an essay and then edit it to be more accurate and in their own voice. “The whole point [of a writing assignment] is starting from scratch,” he said, noting that technology used this way disrupts the learning process.

Dr. Eaton said first- and second-year projects that don’t require higher-order critical thinking might be at the highest risk, especially in high-pressure courses. “If they need to do it to keep up, or [because they see] others are doing it, it’s more likely students will undertake misconduct,” she said. Meanwhile, essay mills may be able to produce content for sale faster, said Dr. Bates. However, since students can use AI on their own and often for free, these outfits could see their business suffer. This comes after cheating companies cashed in during the pandemic with discounted essays and offers to sit remote exams for students.

A holistic approach to academic integrity

To nab those using AI to cheat, instructors are turning to tools such as GPTZero, created in Toronto by a Princeton computer science student. Dr. Eaton said such programs may be helpful, but people are just chasing the problem. “The pace at which things are changing, it’s not going to slow down,” she said. “The next tool is just around the corner.” GPT-4, Microsoft’s Bing chatbot and Google’s Bard will likely be followed by more big-name and niche products.

Instructors are considering more oral exams, handwritten essays and in-class projects as workarounds. “What we’ve learned, or should have learned by now, is that it’s inappropriate for teachers to assess their students the same way they were assessed,” said Dr. Eaton.

To sort out what to change, Dr. Parnther said instructors should look to their learning objectives. However, assessment tweaks will be challenging in large, lecture-based classes. “Not everyone has a 20-student class,” noted Dr. Stark.

For transgressors, schools are looking at different approaches to punishment. “We’d like to look at processes where the student would come to understand issues around misconduct and restore the thing that was lost, which most often was their learning,” said Dr. Turner of her hope that U of Saskatchewan will move to a restorative justice approach to address cheating.

That’s already happening at UBC, which now has a so-called diversionary process that allows some students who have admitted to misconduct to develop an integrity plan. “It’s a teachable moment in terms of helping them understand the expectations of a scholarly community,” said Dr. Bates.

Indeed, prevention and a deeper embrace of equity issues may be the best way for universities to fend off misconduct, regardless of the latest software. Said Dr. Bates: “These technologies are going to be part of the fabric of our digital lives for a long time to come.”

COMMENTS

  1. Professor / May 11, 2023 at 08:38

    There is only one thing that will stop students from cheating: kick them OUT if they get caught. It was fear of expulsion that stopped most of us from cheating back in the days when I was in school. We are far too lenient on students today. They are adults and should live with the consequences of their poor decisions.
    They know they can cheat because their friends cheat and get away with it, so they cheat too. If they get caught, they just keep cheating because there are no real consequences.

    • Prof / May 11, 2023 at 14:50

      Thank you! Finally, someone is saying it: accountability. We need to teach critical thinking, not memorization. Part of the learning process is also understanding that there are consequences to every action, including our own.

  2. John / May 11, 2023 at 18:58

    I do think we need to acknowledge that the vast majority of academic integrity issues are dealt with quietly. The amount of paperwork required for an offense is not worth the instructor’s time, particularly at a time when sessionals make up a large percentage of instruction.

  3. Helen / May 12, 2023 at 20:44

    If we want to address academic integrity, we need to lead by example. Academics are some of the most vicious, politicised vipers in existence. You’re not going to get integrity when your entire industry is led by a willingness to undermine your own integrity in order to advance your career. How many of you have given accommodations students were ineligible for because it was easier than dealing with the tantrumming student with poor time management skills? Those time management skills aren’t going to develop themselves. How many of you went along with coerced medical experimentation for a virus with a 99.7% survival rate, despite familiarity with the TCPS-2 policy statement? Get real, folks. If you’re going to ride around on a high horse, you might want to first look in the mirror to ensure you’re not the donkey.

  4. Qusay Mahmoud / June 5, 2023 at 10:17

    I taught a graduate course in Winter 2023, designed the test to be take-home and made it a requirement to use ChatGPT. Students had to validate ChatGPT’s responses (not all are correct), write the correct answer in their own words, and upload a copy of their ChatGPT session. This may not be possible in all disciplines; more details at: https://drqwrites.medium.com/ten-ways-to-integrate-chatgpt-into-software-engineering-education-f916cd4b6575 and https://drqwrites.medium.com/ethical-and-responsible-use-of-generative-ai-in-scholarly-research-a96b7e3cf4f
