The Uncomfortable Climate Set by the Climate Survey
Anna Guerrini '25 Associate Editor
November 30, 2023

How much do you sleep, Deerfield, and do you feel like you have too much homework? Do you feel engaged in your classes, like you are learning something new and powerful? Do you feel like you belong, deeply, truly? 

These were some of the questions asked at the end of the 2022-2023 school year in the newly implemented Deerfield Climate Survey. I love the idea of this survey. Though I dedicate hours (and hours) to writing for the Scroll, I am first and foremost a woman of math. I check my Google Sheets to see if I have enough money to order DoorDash, I analyze polls to see the current political approval ratings, and I time my schedule down to five-minute intervals. The idea of seeing the student experience quantified is, understandably, incredibly appealing to all the parties involved. 

As Associate Head of School for Student Life Amie Creagh said in an interview, “We care about the lives of Deerfield Academy students…We care about what you said enough to reflect it back to you.” Through the survey results, teachers can see what areas their students are struggling in academically, administrators can properly assess what campus guidelines are and aren’t working, and students can see firsthand what behaviors their companions actually engage in. If they can see those results, that is. 

I was quite upset upon receiving the results of the Climate Survey through a school meeting presentation. From my memory, it took me 40 minutes to complete the Climate Survey, during which I was questioned in detail about my drinking habits, my mental health history, and the perceived usefulness of my academic coursework. Of these questions, the answers to only a select few were shared. Of this selection, even fewer were, in my opinion, properly contextualized. I believe that students should be able to form their own opinions, and should have the opportunity to review all the available information about an issue before having conclusions spoon-fed to them. 

Mrs. Creagh said that Climate Survey results were compared “to the Youth Risk Behavior Survey that is administered nationally. Some of the questions are the exact same, so it is comparing apples to apples.” So, Deerfield, when the presentation mentioned a “national average,” these were the exact numbers they referred to. According to the Youth Risk Behavior Survey, 22.2% of respondents in 2021 had seriously considered suicide. According to the same survey, 28.3% of students said their mental health was “most of the time or always not good.” While these numbers are unpleasant, I list them to highlight a point I would like to make: the data presented in the Climate Survey school meeting was true, but it was not properly contextualized. 

This is a true statement: “In the year 1984, sharks bit 14 people in the U.S. while New York City residents bit 1,589 people” (Florida Museum). This is a properly contextualized statement: “While New Yorkers bit almost 115 times more people than sharks in 1984, there were far more New Yorker vs. human confrontations than there were shark vs. human confrontations.” That example is comedic and, quite frankly, absurd, but it shows how the contextualization and presentation of data is just as important as the data itself. During the school meeting presentation, administrators boasted that the Academy fell around or below the national average for certain mental health statistics. They gave neither the percentage of Academy students who struggled with mental health issues, nor the national percentage of students who met the same criteria. They did not mention that high schoolers are facing an unprecedented global mental health crisis. They failed to mention that the average itself is abysmal. The presentation of this piece of data was incomplete and misleading, even if the statistics were factual.

The presentation took two pieces of data – to what degree students value grades over learning and to what degree they felt their classes were worthwhile – and connected them with a simple comparison of a few pie charts. However, data analysis is a web. How many of the “grades-oriented” students showed signs of poor mental health? Did “grades-oriented” students tend to get higher grades? Was there a sharp gender divide in the results? That distinction is surprisingly important to make: a study (Mark Pelch, International Journal of STEM Education) found that women “report significantly higher levels of student anxiety than their male peers.” One piece of data was contextualized with only one other piece of data. Once again, all the data in these few slides was true, but its presentation painted a biased and fragmented picture of the student experience. 

In my opinion, there is simply no way for curious students to receive a complete picture within the confines of a 45-minute slot in the middle of their Wednesday mornings. Instead, I would have appreciated seeing all the data released to the entire student body, perhaps as aggregated data rather than raw spreadsheets so that no individual response could be isolated. That would give students the ability to independently form conclusions, whether those conclusions be, “Oh, I had no idea that so few students actually drank on campus, and it makes me happy to know I am in the majority,” or “Oh, I had no idea so many students feel like they are unhappy at Deerfield; I should think about what I do in my daily life to try and spread more kindness.” At the very least, I would have loved to see the presentation itself released to the student body for further consideration. The SLO declined to share that document with me for this article, which makes this issue hard to report on and even harder to discuss long term, seeing as notes and memory can quickly fade into hyperbole and oblivion. 

I believe that sharing data can have profound impacts on anyone who takes the time to process and interpret it, but especially on student leaders. I can only imagine how much change could occur if, say, alliance leaders learned to what degree affinity students felt they belonged on campus. If leaders of the Gender and Consent Committee knew which populations (by gender, sexuality, grade, etc.) don’t associate intimate behavior with the word “consent,” they could create better-informed campaigns to promote healthy relationships. However, I can only imagine these scenarios, because data concerning these questions was not shared during school meeting. Evidently, it was not meant for the entire community. 

Deerfield has done an excellent job of promoting media, political, and scholarly literacy during my time here. I have attended countless classes in the library, learning how to distinguish between worthwhile and mediocre evidence. I have spent history classes learning how to engage with sources while recognizing their inherent limitations. I have attentively listened to two (soon to be three) Deerfield Forums, seeking to form my own opinions about the topic at hand while still actively engaging with those I disagree with. How can the Academy promote such engagement with AI-generated armageddon, but passively discourage discussion about the student climate?

Deerfield, why did you even bother to answer any of the above questions? Sure, many people probably only answered because they were practically forced to, but I firmly believe many others did because they care. Because they are curious about what the current student climate is, and because they want to improve it. We can’t even begin to do that through a select few statistics, taken largely out of broader context and used to push a highly specific narrative. We want the evidence; we want to draw our own conclusions. We want to thoughtfully engage in the realities, the triumphs and shortcomings, of our Deerfield. We want to care. Let us. 


Kayleen Tang