What Counts as AI?
Kevin Yang '25 Staff Writer
November 5, 2023

As part of the updates to the Student Handbook, Deerfield Academy enacted a new set of policies regarding the use of generative artificial intelligence (AI). The new policy is the Academic Affairs Office’s comprehensive response to the emergence of more accessible artificial intelligence technologies that began to bypass learning late last year. The new guidelines build on the policy set in February 2023, just a few months after the release of ChatGPT. The update, which arose from extensive deliberation by the Office of Academic Affairs, aims to provide clarity and uphold the Academy’s goal of learning.

According to the Student Handbook, “As of Fall Term 2023, students are discouraged from using AI-generative tools for student work unless under direct instruction from faculty to do so.” If a student is explicitly allowed to use AI, they “must cite this and include all prompts they used in their process or submitted product.”

Dean of Academic Affairs Anne Bruder said that the policy is meant to prevent students from undermining their own learning. “What is learning for you? AI could imperil what we believe learning to be at its core,” Dr. Bruder said. Instead of learning through “a process of trial, error, feedback, trial, error, feedback,” students could “shortchange that.” Dr. Bruder’s primary concern is students “bypassing the thinking,” a crucial part of learning, with the new technologies.

In general, students also provided positive feedback on the new policy. Eric Li ’24 said he “agree[s] with the approach” and described the policy as “clear.” Li views the ban as a necessary step in the current situation: “Besides, they leave the opening for teachers to allow AI use,” he said. Marco Feng ’26 also supports the policy, saying that “in a democracy, if people cease to think, we have committed individual and political suicide.” Feng added that the school made a “deliberate choice of action and I think that is appropriate.”

Dr. Bruder said that at the end of the 2022-2023 school year, “some students [were] using [AI] in ways that were in conflict with our policy,” which further cemented her belief in the need for a policy. Throughout the summer, the Academic Affairs Office worked to develop guidelines and to communicate with and educate faculty about AI. Persuaded by scholars noting the unusual position schools find themselves in this year, Dr. Bruder said: “You cannot start school business as usual this year. [AI] has shaken up the educational world. You have to come out with total clarity to guide institutions through this sea change.”

Along with changes to the Student Handbook, the Academic Affairs Office has been working with teachers to conduct more discussions on the meaning of “learning.” Dr. Bruder said that in the school’s adapted approach, teachers “would simply start talking to students about what learning means.” The Academic Affairs Office also created guides to help faculty navigate the world of AI. Specifically, one guide included three different approaches to designing an assignment: one against AI, one “kind of curious about AI,” as Dr. Bruder put it, and one taking full advantage of AI. The guide lets faculty members evaluate the approaches and think through AI if they choose to. “We are letting faculty make those decisions,” Dr. Bruder said.

Although the school tries to be clear and deliberate, there can be gray areas with certain technologies. Often, there is no clear line between what counts as generative AI and what does not. Technologies such as autocorrect, Grammarly, and Wolfram Alpha may or may not be allowed depending on the academic discipline. Despite potential inconsistencies, Feng believes there is little point in getting caught up in the details. “When in doubt,” Feng said, there is always the option to “just ask your teacher.”

Li shared a similar opinion about the muddiness of where to draw the line around generative AI. For example, academic integrity questions could arise with an autocorrect program that can correct spelling and grammar mistakes, suggest synonyms, rewrite sentence structures, or generate entire sentences and paragraphs. “I can definitely see how that would be blurry,” Li said. “The school is basically… just putting the axe down on AI. There’s this kind of fear that AI is going to take over our minds and reduce our creativity. We will learn to navigate that.” Ultimately, Li recognized that it takes a lot of work to answer specific questions and refine the line between what is allowed and what is not.

Dr. Bruder shared a similar view. She explained that the Academic Affairs Office is more focused on technologies into which students can “enter an assignment and with zero thought, click a button and get an answer.” “That was, for us, the real defining quality of generative AI: the content creation.” However, she did acknowledge the rapidly developing landscape and potential changes to these policies. “I and many of my colleagues have been in education for decades, and we’ve never witnessed something moving this quickly,” she said.

Dr. Bruder compared the emergence of AI to the invention of the internet, which also forced the world of education to change radically to adapt to the availability of data and resources online. “The internet emerged in my final years of college and then was sort of a nascent thing as I began teaching,” she said. Even though the rise of generative AI might seem similar to the rise of the internet, what many people forget is that “that was actually a much slower transition. That whole process was really a decade or decades-long movement” starting from “a quite rudimentary web that didn’t actually do that much, and very slowly content was added,” according to Dr. Bruder. It absolutely changed education, Dr. Bruder said, “but there was time to sort of process and think about those changes as they were unfolding. This, on the other hand, felt very much like flipping a switch.”


Generative artificial intelligence is “a quickly unfolding story,” Dr. Bruder concluded. As the situation develops, she continues to discuss AI with and educate the community, working to stay “ahead of the game compared to many of our peer institutions.” She recognizes the need to be “very thoughtful about policies” and said that the Academic Affairs Office “has a lot of thinking to do.”

Even though the new policy in the Student Handbook may read like a simple blanket ban, the Academic Affairs Office spent a great deal of time behind the scenes developing it and discussing it with faculty. Despite the rapidly developing and intricate situation, the Academy is always looking to reevaluate and improve its policies.