
A computer screen at the front of a classroom displays ChatGPT, a recently developed chatbot from OpenAI. Credit: Benjamin McAvoy-Bickford

ChatGPT reached 100 million users on Thursday, just two months after its launch, making it the fastest-growing consumer application on record and nearly five times faster to that milestone than the previous record holder, TikTok. Its rapid adoption and seemingly endless use cases have prompted stories saying it will destroy the world, save the world, and … make beer? However, among the millions of takes, I couldn’t find anything about how the Ivy League plans to change its curriculum in light of AI’s abilities. That’s a problem.

Higher education needs to rethink its curricula to prepare students for the future they are already living in. Penn needs to stop teaching skills that AI has rendered useless and should instead introduce model prompting in relevant coursework and embrace an AI-assisted education system.

Education systems are responding to AI in one of two ways. The first and loudest of these responses is the “Get Off My Lawn,” Homer Simpson type. This group’s response has been to ban ChatGPT-generated text, arguing that it eliminates the need for critical thinking skills, dilutes public discourse, and devalues human-centered experiences.

The second is the counterculture, “Make Love, Not War” response, which stresses the value that ChatGPT can provide in the classroom while minimizing its impact on academic integrity. Both of the Penn professors I spoke to about AI fell into this category, which also has many public proponents.

The first school of thought is a dead end; just as Socrates once argued against writing because it would damage memorization, so too do AI’s naysayers fear that human cognition will atrophy to the point of a Simpson-like existence. The second, while more open to the use of AI, underestimates AI’s long-term impact on our society and, consequently, is not doing enough to prepare the next generation for the future.

Itamar Drechsler, a professor of finance at Wharton, said that ChatGPT “makes not doing your work easier. Just like you could have asked your friend for their homework, you can now ask AI to do it with a similar or worse result.” He added, “ChatGPT is basically a really good calculator. We didn’t stop teaching addition because of the invention of the calculator, did we?” He makes a good point. Should we really change what’s being taught just because it can now be done with AI?

I certainly think so. I don’t want to learn things that can be generated for free from a website, and I don’t think that the students at Penn who pay $85,000 a year want to learn skills that are already obsolete or will be in five years. AI is simply our generation’s calculator.

So why don’t all elementary schools stop teaching addition now that we have calculators? Addition is still a critical skill because it is instrumental for doing difficult math that cannot be so easily performed by a calculator. Thus, Penn doesn’t need to stop teaching everything that can be done by AI; rather, the University could refrain from teaching skills that are ends in themselves and not means to acquire other skills.

I call these skills intrinsic skills, or skills that are not necessary to support future learning. An example is building a Discounted Cash Flow model, which helps analysts value a company but is not necessarily instrumental in future learning (Goldman Sachs, if you’re reading this, I’m sorry … please give me a summer internship).

Adjusting to the new market and society that AI has created will help Penn stay ahead of the curve in preparing its students for the future. Important skills for the 21st century will be less about execution (being able to do things yourself) and more about knowing what you want to do and being able to prompt a machine to do it for you. Therefore, I suggest adding a prompting chapter to every course at Penn that walks students through the effective use of AI for content related to the class.

In the humanities, for example, learning how to use ChatGPT to write a more effective first draft of an essay would go a long way toward improving both your writing speed and the quality of your piece. Or, to get a bit more in the weeds, learning how to use the GPT-3 API to fine-tune your own model for the specific type of paper you are writing can drastically improve outcomes, as the sketch below illustrates.
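For the curious, here is a minimal sketch of what that fine-tuning workflow looked like at the time of writing. It assumes the pre-1.0 “openai” Python package and a hypothetical file, essay_examples.jsonl, of prompt-and-completion pairs drawn from your own past papers; the exact endpoints and base-model names are OpenAI’s and subject to change.

# A minimal sketch of fine-tuning a GPT-3 base model on your own writing.
# Assumes the pre-1.0 "openai" Python package and a hypothetical training
# file "essay_examples.jsonl" of {"prompt": ..., "completion": ...} pairs.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # your personal API key

# Upload the training examples (e.g., essay outlines paired with the
# paragraphs you eventually wrote) to OpenAI's file storage.
training_file = openai.File.create(
    file=open("essay_examples.jsonl", "rb"),
    purpose="fine-tune",
)

# Start a fine-tuning job on a GPT-3 base model; the job runs on OpenAI's
# servers and produces a custom model you can then prompt like any other.
job = openai.FineTune.create(
    training_file=training_file.id,
    model="davinci",
)
print("Fine-tune job started:", job.id)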

Chris Callison-Burch, an associate professor of Computer and Information Science who teaches artificial intelligence at Penn, agrees and thinks AI can and should be used in all areas of the classroom. He has used AI to help him summarize his lectures and even to come up with multiple choice questions about the material he covers in class. Callison-Burch thinks that prompt engineering “is useful for people to understand how models behave and to have a basic familiarity with how AI can support the work you are doing.”

Penn professors and administrators need to rethink curricula that can be displaced by artificial intelligence and replace them with instrumental skill teaching, prompting, or anything else that could become more valuable in the context of artificial intelligence.

Today, we lead the Ivy League and global education system in preparing students with marketable skills, and have never genuflected to a humanities curriculum the way many other elite institutions have. We have never hesitated to adopt new curricula that are critical to the markets our students have gone on to participate in. Similarly, we should not hesitate to integrate ChatGPT in a meaningful way. 

BRETT SEATON is a Wharton sophomore studying Finance from Manhattan, Kansas. His email is bseaton@wharton.upenn.edu.