Faculty Forum: Learning with ChatGPT
By Russell Chun
In less than a semester’s time, the artificial intelligence (AI) chatbot known as ChatGPT has astounded the public with its remarkable ability to generate sophisticated and humanlike text. Introduced by the research lab OpenAI in November 2022, ChatGPT has since been treated like an alien creature, subjected to a barrage of tests by a curious public poking and prodding to explore its abilities and find its limits. Its wide-ranging output—including prose, code, poetry, screenplays, song lyrics, and jokes—has been impressive, even amid noted concerns about quality, accuracy, and bias.
In higher education, ChatGPT has raised alarms about its impact in the classroom, with many educators scrambling to understand how to handle such a powerful tool that makes plagiarism easy. The uncertainty and hand-wringing have created panic, with some already bemoaning the end of the college essay or advocating measures such as bringing back handwritten assignments to prevent cheating. Others have suggested technological defenses: ChatGPT could, in theory, embed a cryptographic watermark in its output so that teachers could detect whether an assignment was written by the student or generated by AI.
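For readers curious how such a watermark might work, here is an illustrative sketch in Python of one idea proposed by researchers: the generator biases its word choices toward a pseudorandom "green" subset of the vocabulary that depends on the preceding word, and a detector recomputes that subset and counts how often it was used. This is a toy, not OpenAI's actual scheme; the function names, the vocabulary, and the fifty-percent green fraction are all assumptions made for the example.

```python
import hashlib
import random

GREEN_FRACTION = 0.5  # assumed share of the vocabulary marked "green"

def green_list(prev_token: str, vocab: list[str]) -> set[str]:
    # Seed a PRNG from the previous token so a detector can
    # recompute the same "green" subset without access to the model.
    seed = int(hashlib.sha256(prev_token.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    k = int(len(vocab) * GREEN_FRACTION)
    return set(rng.sample(vocab, k))

def green_rate(tokens: list[str], vocab: list[str]) -> float:
    # Fraction of tokens drawn from the previous token's green list.
    # Ordinary human text should hover near GREEN_FRACTION;
    # watermarked text should score well above it.
    hits = sum(
        1 for prev, tok in zip(tokens, tokens[1:])
        if tok in green_list(prev, vocab)
    )
    return hits / max(len(tokens) - 1, 1)
```

The statistical nature of the test is also why such detection is imperfect: short passages, paraphrasing, or light editing can push the score back toward chance, which is part of why technical countermeasures alone are an unreliable defense.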
While it is tempting to protect our way of teaching and giving assessments, as educators, we need to resist approaches that reinforce an adversarial relationship with our students. Too often higher education pits students, who are seen as potential cheaters, against faculty members, who are the vigilant enforcers of rules. This mentality has reached near absurd levels in recent years, with students taking remote exams through browser lockdown technology, sometimes coupled with live webcam surveillance to detect any suspicious activity and to track eye movement.
Aside from the counterproductive culture that this antagonism fosters, such solutions only add to the burdensome layers of technology and fuel the arms race of cheating hacks and anticheating mitigation. With the release of the even more powerful GPT-4, the task of keeping up with technical countermeasures looks even more Sisyphean.
It has been said that ChatGPT is to the humanities what the calculator was to mathematics. These new tools are not something to fear or to stifle. They should transform the ways we teach as we incorporate them into our pedagogy. We should embrace the new technology because of its critical importance in our students’ future professional lives. AI tools are already in use in journalism, generating formulaic, data-heavy stories in business or sports to free up humans for more creative and higher-level endeavors. As a journalism professor, my responsibility is to help students navigate the ethics, features, and limitations of these communication tools. We need to understand, or at least acknowledge, the underlying processes guiding AI text generators. In that regard, ChatGPT is valuable as a lesson in digital literacy as much as it is for its output.
We are engaged in a learning process with our students, and writing is ultimately a process and a “form of thinking.” Focusing too much on a finished product only reinforces a purely transactional relationship with students, where they provide an assignment in return for a grade. As much as I want to see a polished product, it could be far more useful for me to ask my students to “show your work,” as elementary school math teachers require. How would they do that? One way might be to ask students to use Google Docs and provide access to view the version history, which records all of a student’s starts, stops, rewording, and rethinking. Google’s version history is much like Microsoft Word’s Track Changes but allows you to see how a single user has edited a document over time.
Other ways for students to show their work can be quite simple—breaking down the process so students reveal their research notes, initial outlines, references, or any other relevant preparatory material. In doing so, we expose methodology that we have come to expect to see in other academic fields. We know from popular culture that going behind the scenes for any creative project can be as fascinating as experiencing the final product itself. In the end, ChatGPT forces us to unpack the practice of writing and thinking, which serves as both an instructive pedagogical approach and an effective deterrent to plagiarism.
Russell Chun is associate professor in the Department of Journalism, Media Studies, and Public Relations at Hofstra University’s Lawrence Herbert School of Communication. Academe accepts submissions to this column. Write to firstname.lastname@example.org for guidelines. The opinions expressed in Faculty Forum are those of the author and do not necessarily represent the policies of the AAUP.