With the rise of ChatGPT, SU professors navigate its role in the classroom
Emma Lee | Contributing Illustrator
At the start of the spring semester, Syracuse University’s Center for Teaching and Learning Excellence disseminated a resource document for professors on artificial intelligence in academia. While preparing their spring syllabi, some faculty said they were unsure how to address the opportunities for cheating and plagiarism that followed the introduction of ChatGPT in November 2022.
ChatGPT, whose name comes from Generative Pre-trained Transformer, is a chatbot trained on 570 GB of internet data, including Wikipedia, web texts and books, to produce written responses to user-submitted queries. The OpenAI-created bot is built on GPT-3.5, which Jing Lei, SU School of Education interim associate dean for academic affairs, described as part of the same family of technologies.
Now, responding to students’ newfound access to automatically generated text, SU professors are navigating ways to use ChatGPT both as a tool to enhance learning and as a technology that raises questions about academic integrity.
For retired SU professor and plagiarism expert Rebecca Howard, the tool’s potential in education outweighs any plagiarism or cheating concerns it might present. She pointed to previous technological advances, like the internet in the 1990s and Wikipedia in the early 2000s, which, at the time, sparked similar panics over cheating and dishonesty.
“People were forbidding their students to get on the internet, which is just hilarious to think about,” Howard said. “Collectively we’re acting pretty much the same now as we collectively did with the introduction of the internet, and of Wikipedia, and that is fearfully. And we’re doing the same thing now. We will work it out and calm down. But right now everybody’s writing about how to catch people using ChatGPT.”
She explained that the AI can’t generate a piece of scholarship on its own in the first place. Instead, it relies on patchwriting, which Howard said takes the language of an original piece, substitutes some synonyms and rearranges words to generate a new text. The bot will say something that seems sensible, she said, but the writing won’t be very good.
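To make the idea concrete, here is a toy sketch of patchwriting as Howard describes it. The synonym table and example sentence are invented for illustration, and this is not a claim about ChatGPT’s actual internals, which generate text statistically rather than by literal substitution.

```python
# A toy illustration of "patchwriting": keep the source sentence's
# structure, swap in synonyms and lightly reorder words.
# The synonym table below is a made-up example for this sketch.
import random

SYNONYMS = {  # hypothetical lookup table, not a real thesaurus
    "quick": "rapid",
    "story": "narrative",
    "shows": "demonstrates",
    "important": "significant",
}

def patchwrite(sentence: str, seed: int = 0) -> str:
    """Substitute synonyms word by word, then nudge the word order."""
    rng = random.Random(seed)
    words = [SYNONYMS.get(w.lower(), w) for w in sentence.split()]
    # Swap one adjacent pair of words to mimic superficial rearrangement.
    if len(words) > 3:
        i = rng.randrange(len(words) - 1)
        words[i], words[i + 1] = words[i + 1], words[i]
    return " ".join(words)

print(patchwrite("The story shows an important quick change"))
```

The result, as Howard suggests, can look superficially sensible while adding nothing new to the original text.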
Chris Forster, a professor in the English department with a focus on digital humanities, used GPT-3.5 for assignments in his fall “Literature and its Media” course. He said that what people can learn about the data set an AI is trained on is more valuable than the wall of text the AI can produce.
“Rather than it being a way to get to write your papers for you, or to provide an intellectual interlocutor, I think these things like the image generators or ChatGPT give people a … way of interacting with these sorts of large data models that are also going to be shaping things that are a lot less obvious, right?” he said. “Companies are going to be using these for things … like determining credit ratings, trying to figure out investment risk.”
In learning about the data that bots pull from, Forster identified the value of disciplines in the humanities. He pointed to the process of interpreting the language produced by GPT-3.5, which he said looks like a close reading in an English literature class.
“It’s trying to think really carefully about, ‘What is the structure of this article? How does it progress narratively? What are the sort of implications and connotations of some words?’” Forster said. “Do I understand what sort of texts have shaped it?”
In her “Rhetorics and Robots” course at SU, Krista Kennedy — a professor in the writing and rhetoric department who studies rhetorics of technology and algorithmic rhetorics — said she’s interested in how much agency and control a human has when writing with a bot like ChatGPT. She said the next time she teaches a class like professional writing, she wants to assign students a format like a cover letter, and have them write one with ChatGPT and one on their own to compare.
In his literature course, Forster asked his students to do something similar when the class read “Dracula.” After crafting a prompt to feed to the AI, students turned in an entirely bot-written paper alongside a reflection. He said one of the most interesting conclusions from the assignment was how much influence the crafting of the prompt had on the end result.
He said that in the class and in his own experimentation, giving ChatGPT a sophisticated prompt produced a more sophisticated response. Students who simply instructed it to “write an essay on Dracula” ended up with text that sounded like a high school essay on the book, he said, adding that plagiarism hasn’t been a significant concern for him because the bot can’t produce insight of its own.
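A minimal sketch of that prompt effect, using OpenAI’s Python client (version 1.0 or later) with a GPT-3.5 family model. The client call is real, but both prompts are invented stand-ins for the course’s actual assignments, and an OPENAI_API_KEY environment variable is assumed.

```python
# Compare a bare prompt with a more constrained one to see how prompt
# sophistication shapes output quality. Assumes OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

PROMPTS = [
    # A bare prompt, likely to yield a generic, high-school-style essay.
    "Write an essay on Dracula.",
    # A more sophisticated prompt that constrains focus, evidence and form.
    (
        "Write a 500-word close reading of the epistolary structure of "
        "Bram Stoker's Dracula, arguing how the diary and letter format "
        "shapes the reader's trust in each narrator. Quote the text."
    ),
]

for prompt in PROMPTS:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # a GPT-3.5 family model, as in the course
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- Prompt: {prompt[:40]}...")
    print(response.choices[0].message.content[:300])  # preview the output
```

Run side by side, the constrained prompt pushes the model toward specifics, quotation and structure, which is the difference Forster describes.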
“In a field like English literature or film studies, to some strong degree our bread and butter is close analysis. If you spent any time playing with ChatGPT as it currently exists, it’s just not very good at that,” Forster said. “It’s very good at sounding smart. It’s just not very good at being smart.”
Howard said she’s actively using ChatGPT to help her write a book about teaching with the service. She said that rather than worrying about whether writing with AI is cheating or plagiarism, people in academia should be figuring out how to go about crediting a bot as a collaborator.
Howard said that collectively deciding how to ethically acknowledge having collaborated with a bot, in ways based not just on 19th-century ideas of plagiarism but on what applies to the present day, is a more pressing concern than whether students are using patchwriting to produce simple text.
“If students are using ChatGPT to answer assignments, and the instructor can’t tell that it’s not a student in their class who wrote it, there’s the problem,” Howard said. “The problem is in the assignment if we’re asking students to do stuff so basic that ChatGPT can do it instead. There’s where the real fear is, (where) everybody realizes they’ve got to change what they’re doing. You don’t want to have to completely teach differently, but that’s what we’re gonna have to do.”
Margaret Usdansky, the founding director of SU’s Center for Learning and Student Success, said SU doesn’t have plans for any significant adjustments to its academic integrity policy or to how it evaluates student work produced using ChatGPT.
Usdansky said that because the university’s academic integrity policy can be applied to AI-written work, the current policy is suited to new technologies like ChatGPT.
“For now, the policy is broad enough and talks clearly enough about instructors’ ability to set course-specific expectations, and about the assumption that the work you turn in is your own, unless it’s clear from the assignment that it can draw on other sources,” Usdansky said. “And that isn’t going to change overnight because ChatGPT exists, right?”
In its resource document on ChatGPT, SU provided recommendations for professors concerned about cheating and plagiarism in their courses, including AI detection software like the GPT2 Output Detector and GPTZero.
Among other recommendations, the document also includes alternatives to assignments that students could complete with ChatGPT, like concept maps or other visualizations in place of traditional writing-focused assignments, as well as podcasts, videos, speeches, interviews, drawings, storyboards and performances.
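For the curious, the GPT2 Output Detector mentioned in the document is a RoBERTa-based classifier released by OpenAI that can be run locally through Hugging Face’s transformers library. A hedged sketch follows, assuming the published model ID and labels; the sample sentence is invented.

```python
# A sketch of running the GPT-2 Output Detector locally via Hugging Face.
# Note: the detector was trained on GPT-2 output, so its verdicts on
# ChatGPT-written text are unreliable; this is illustrative only.
from transformers import pipeline

detector = pipeline(
    "text-classification",
    model="openai-community/roberta-base-openai-detector",
)

sample = "Dracula is a novel that explores many important themes."
result = detector(sample)[0]

# The model labels text "Real" (human-written) or "Fake" (GPT-2-like).
print(f"{result['label']}: {result['score']:.2f}")
```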
Bei Yu, an iSchool professor and director of SU’s Information Science and Technology Ph.D. program, said that as far as academic integrity goes, ChatGPT is one of many incoming technologies that will dramatically change the way people live their lives. She said that when thinking about the role AI should play in education, it’s important to treat understanding of the technology as a skill set students will need as they enter the workforce.
“At many different levels for a university, educational institution or for higher education as a profession in general, we need to be forward-thinking, we need to be proactive instead of reactive,” Yu said.
For Forster, the ultimate goal of incorporating ChatGPT into courses and exploring how the technology works is to expose students to it at the earliest possible moment so they can be prepared to navigate it in their lives and careers. But when it comes to producing work that matters, he said, he doesn’t see any utility in the tool.
He gave the example of modernist art, referring to the premise that because pieces in the genre sometimes look meaningless, the art has to convince the viewer to take it seriously. Student work, he said, should do the same.
“What I’m looking for when I read an essay is not just does it sound like a human, but is it a human?” Forster said. “With ChatGPT … people seem to ignore that as a reader of something, I care about what produced it and inspired it. Was there a person who actually had this thought and actually meant and intended it with conviction?”
Published on February 2, 2023 at 1:47 am
Contact Jana: jlseal@syr.edu | @JanaLoSeal