By Isabella Nguyen and Lilac Nguyen, Banner Staff
It’s 11:00 on a Sunday night, and a Winsor student sits at their desk, staring at a blinking cursor. The essay prompt is open, but so is ChatGPT. Within seconds, they have a detailed outline, a refined thesis, and an impeccable conclusion. Efficient? Definitely. Authentic? Definitely not. As they read the words on their screen, a daunting question lingers: am I really even learning?
Generative artificial intelligence (AI) has quickly become a part of academic rhythms in many institutions, reshaping how students study, write, and even think. While AI can serve as a lifeline for clarity and speed, it has also redefined what it means to “do the work” and, perhaps more subtly, what it means to learn.
For many students, AI tools feel like a supportive guide. For instance, Leila Pan ’28 explained that she occasionally uses it “for studying or as a resource if [she] need[s] a concept explained in clear terms… [as] its information is usually pretty accurate.” And the appeal is clear: information delivered instantly, explained clearly, and tailored to individual needs. Studying becomes faster, easier, and, on the surface, more effective.
But researchers are beginning to question the tradeoff. A recent MIT Media Lab study found that participants who relied on AI for writing tasks showed lower brain engagement and decreased originality over time, compared with those who worked independently. The study suggests that while AI has the potential to make learning more efficient and can serve as a teaching device, it can also weaken the deep cognitive effort that turns information into true understanding.
Teachers at Winsor are trying to balance those possibilities with caution. History Teacher Ms. Lieberman noted that the department strives for “transparency and clarity in its approach to AI,” helping students understand “when AI is appropriate to use and when it serves as an impediment to learning important skills.”
She explained that the department “recognize[s] that there are ways that AI can support student learning” and is working “to incorporate it within the curriculum” where appropriate. Some tools, she shared, like NotebookLM, which turns readings into short podcasts, can help auditory learners engage more deeply with material. Still, she emphasized, “Some of the work just cannot be replaced by AI. We want students to develop the habits of critical reading and independent thinking.”
As Winsor students and teachers alike adapt to this new landscape, the challenge is no longer whether to use AI but how to use it without losing the human element that defines learning. Because when the cursor blinks and the screen glows late at night, the real question isn’t whether the essay gets done; it’s whose voice we are going to trust: AI’s, or our own.