My friends and I sometimes find ourselves reminiscing about a time before ChatGPT. We say it not out of sanctimony, but out of personal disappointment.
Here at Penn, long gone are the days when one sat down and wrote a paper without help from Claude or Google NotebookLM. Completing class assignments feels less like an intellectual exercise and more like a choreographed dance between different AI platforms. After years of slaving away over high school essays, have we been reduced to mere prompt engineers?
Maybe the simplest solution is to blame the AI oligarchs and us brainless zombies who use their platforms. If we all just gathered the resolve to avoid ChatGPT, perhaps we could collectively recover our brain matter. But what if we instead considered how Penn’s culture surrounding learning and its structure of classes might be encouraging students to take shortcuts?
Culturally, AI exacerbates our predisposition to preprofessionalism. Because most of us enter college with a hyperintensive focus on our planned major and career path, we push seemingly irrelevant classes to the periphery. The pace and volume at which we work make it especially tempting to let AI handle the fluff.
And the “fluff” in question is usually general education requirements, which are a prime case study of AI-induced apathy. Often, though not always, classes meant to fulfill one of the College’s plethora of sectors and foundations appear unenticing and laborious. In turn, the most coveted classes aren’t those with a reputation for deep engagement or thought-provoking professors. Instead, we fight over the courses that double count for gen-eds and hold the lowest difficulty rating on Penn Course Review.
In these classes, the writing is on the wall. On the first day, professors and students almost immediately enter into a semi-spoken agreement: the professor knows that the students don’t want to be there, and the students know that the professor wants to be rated well. In exchange for an easy A, students give the class a high “professor quality” rating and a low difficulty rating. The next semester, new students are drawn to enroll, and the cycle continues. Everyone is happy.
But the diplomatic, mutually beneficial contract of these classes makes them functionally pointless. Ultimately, nobody gains anything from an arrangement where a professor lectures students who have no intention of learning. The big picture — the point — gets lost. It should be no surprise, then, when even a minor assignment for this kind of class feels enormously cumbersome. And who better to employ to complete seemingly meaningless work than ChatGPT?
It’s not the content of gen-eds that makes them functionally worthless, but rather the culture surrounding them and the attitude of all parties involved. In this way, AI is perhaps not the cause of anti-intellectualism, but rather a symptom. It has unmasked the culture of intellectual apathy that underpins so many classes. I agree that AI has made us dumber, but banning or restricting it won’t suddenly make students care about their gen-eds, or inspire them to pursue the liberal arts education that the College so vehemently advertises.
Changes to the nature of gen-eds, however, can functionally suffocate AI. We must foster a culture of enrollment where classes are popular for their application to real life and their ability to induce intellectual excitement, not for their light workloads or ease. First-year seminars, for example, are particularly good at generating buzz for their quality. I’ve encountered very few people who didn’t rave about their first-year seminars (and that’s not for a lack of A’s available in those classes). Imagine if courses across the University generated the same kind of press among students.
The College seems willing to meet the challenge. Two weeks ago, the College Committee Chairs announced a new vision for gen-eds that loosens restrictions on course attributes and makes room for greater exploration. But this restructuring isn’t enough to eliminate apathy toward gen-eds if it doesn’t also include changes at the level of instruction and individual courses. In a class whose exigence and relevance you recognize, it becomes intrinsically difficult, even shameful, to use AI. That relevance is almost always there, but it must be underscored in the classroom.
AI use won’t die out simply because gen-ed classes become more rigorous. But this unspoken contract that exchanges A’s for ratings must be ripped up. When curiosity and real-world reasoning enter the classroom, AI will exit quietly.
DEW UDAGEDARA is a College first year studying neuroscience from Long Beach, Calif. His email is dewdunu@sas.upenn.edu.