Emily Carbone ‘28
Features Editor
Artificial intelligence is becoming a tool used daily by millions. I doubt any reader of this article can say they’ve never touched it. Maybe you’ve used it to put together a grocery list, get advice, or generate a worksheet for an upcoming quiz. Whatever the case, I am sure you’re at least somewhat aware of the scope of generative AI, as well as how it creeps into academic spaces. If you take a look at one of your syllabi, it likely contains something about your professor’s policies on AI usage in the classroom. Some uses of AI in academia are not necessarily dishonest (i.e., the previously mentioned example of creating worksheets), and others are so minor they may as well not count. Correcting grammar, summarizing articles, and organizing your ideas with the help of generative AI can’t be cheating. Right?
While the answer to that question certainly depends on your specific class and professor, I wanted to take a closer look at Holy Cross’ schoolwide policies on AI, and what the administration’s plans are as the technology continues to evolve. I spoke to Dean Klinghard, who serves as a co-chair of our Institutional Review of Artificial Intelligence Task Force, as well as our Dean of Education and Academic Experience, to learn more. The Institutional Review of Artificial Intelligence Task Force meets regularly to reflect on the impact that AI has on every element of Holy Cross. “I think there’s been a lot of discomfort about the subject. Students don’t want to talk about it, because they think they’ll be suspected of cheating,” Dean Klinghard explained. “Faculty don’t always know how to talk about it. But it’s important that we understand each other, that we set clear expectations about legitimate and illegitimate uses of technology.” One of the top priorities of the Task Force is to change the culture of speaking about AI at Holy Cross. Clear outlines of acceptable AI usage will allow students to make the most of all technological resources in academic settings without fearing disciplinary action.
That being said, the possibility of exploiting these same resources complicates college policy. There is, of course, the issue of cheating with generative AI, which includes passing off AI-generated work as your own. Dean Klinghard, however, worries more about the broader problems that stem from using AI while engaging with coursework. The concept of cognitive offloading, or the delegation of intellectual tasks to AI, particularly worries him. “For instance, letting NotebookLM summarize an article for you that you don’t have time to read may enable you to stay engaged in a class during a busy time of the year, but if you do that for every reading, your mind isn’t really engaging with the content.” This affects what Holy Cross students get out of their education, and the skills that their coursework allows them to develop. “A huge part of what we expect to happen to students at Holy Cross is the cognitive development that happens when they read and experiment and struggle and engage with sources outside of the classroom. It is going to take an intentional effort to articulate why they should do this hard work, even though an easier solution is available.”
While the Task Force recognizes the importance of placing restrictions on how students use AI in academic settings, it also sees the technology’s benefits. Holy Cross is working on building an AI math tutor for students to use, and there is already a platform that uses AI to transcribe lectures for students with auditory processing challenges. A quick look at the AI Academic Resource Center on Ignite showcases a variety of AI tools Holy Cross has already developed. Dean Klinghard additionally mentioned the benefits for non-native English speakers, calling AI “hugely transformative…to better understand difficult subjects. But these need to be developed into well-designed products, not just use cases for ChatGPT.” Indeed, Dean Klinghard also commented on the varied advice being given on how to incorporate AI in classroom settings, noting that much of it doesn’t actually explain how AI is more effective than traditional methods, and seems to suggest its use for no clear reason.
Looking towards the future, AI will clearly cause large changes in the academic world. Building off of the talk given by Alondra Nelson last week, Dean Klinghard wants Holy Cross to place emphasis on traditional rules regarding academic dishonesty and classroom policy, rather than constantly revising or changing them. While AI technology will provide different tools for research and revision, you should not expect to see Holy Cross’ policies on it change anytime soon. What you should expect, however, is a shift in conversations regarding AI. Dean Klinghard discussed the way scholars will be able to analyze larger data sets and documents using AI, but also pointed out how this will affect the broader academic community of Holy Cross. “It is going to have an impact on the way faculty design and assess class work. If this is done in a way that makes faculty feel like they are constantly trying to sniff out AI use, or make students jump through hoops to prove that they are not using it, it will undermine trust between students and faculty,” he explained. “But I am hopeful that it will give faculty an opportunity to re-think how we teach, to focus on teaching the things that AI cannot do: rigorous thinking, good judgment, and aesthetic sensibility.”
Featured image courtesy of Holy Cross Magazine
Copy edited by Sophia Mariani ’26