As artificial intelligence increasingly affects academic life, Penn students and professors shared their reactions to the range of AI policies across the University in interviews with The Daily Pennsylvanian.
Penn’s math and engineering departments in particular have already started to incorporate AI into their curricula. While multiple students and professors interviewed by the DP believe AI usage is inevitable in the classroom, others expressed a desire for more guidelines out of worry that an overreliance on chatbots may discourage students from seeking more substantive help from professors or teaching assistants.
Penn’s administration has recently taken steps to prepare for an academic future that includes AI.
In a past interview with the DP, Penn Provost John Jackson Jr. said the administration should “think long term” about what “AI portends for the future of what we do.” In October, the University also entered a cooperative artificial intelligence agreement with the Commonwealth of Pennsylvania to develop guidelines for the productive use of AI in education.
Robert Ghrist, a professor in the Mathematics and Electrical and Systems Engineering departments and the associate dean for undergraduate education for the School of Engineering and Applied Science, emphasized a similar sentiment.
“The job that we have at Penn Engineering is to train the next generation of inventors, of people who are going to bring new tech to life, and translate that tech into tools that make the world a better place,” Ghrist said to the DP. “Anytime something as revolutionary and potentially world-changing as AI comes out, this has to be part of what our engineering students learn how to use. Everybody's got to be conversant with it and able to deploy if needed.”
Students interviewed emphasized the value of incorporating AI into their daily lives.
“AI will definitely become more prominent and useful within any technical field,” Engineering first year Vivian Chen said to the DP. “Learning how to develop a healthy relationship with AI early on is important.”
Ghrist has developed a chatbot for MATH 1410 that integrates with ChatGPT and is trained on the specific curriculum and notation of the course.
“Everything that I’m doing to build this problem generator, I’m showing my students how to do,” he said. “I’m making available to them the same resource that I’m using, and I’m encouraging them to just spam out as many problems as possible.”
Engineering first year Gonzalo Vela, who is taking MATH 1410 this semester, noted the chatbot’s usefulness, stating that he has turned to it for questions before a quiz and “it gets the answer right away.”
Chen also felt that engineering classes “encourage using AI, but not as a sole source of information.” In this context, AI can “help you organize your thinking … and [is] a good way to get a broad overview of the topic.”
Ghrist uses AI to help generate exam questions as well.
“This engine gives me good ideas for multiple choice problems,” Ghrist said. “I have to edit these very carefully, but [with] the core ideas, I’m using AI to generate it.”
Before each exam, he also has the chatbot evaluate the test itself.
“It gives me a report, saying that ‘I see you have covered these topics. You left off these topics. You have this many easy problems, this many moderate problems, this many really challenging problems,’” Ghrist said.
Ghrist shares the report with students after removing test answers, which he believes has “turned the stress style down a little bit.”
Incorporating AI also allows him to test different skills, Ghrist noted.
“No more writing out all the math details, [with] me checking your algebra and dinging you because you used the quadratic formula wrong,” he added. “This is higher-order conceptual thinking that I'm testing now.”
Chen noted that AI policies have become more lenient in general over the course of the fall semester, even in her humanities classes.
College first year Andreana Lee agreed, adding that in her creative writing class, she is able to use AI as long as it is used “in a way that is creative and does anything new.”
Still, other classes — such as some computer science courses — prohibit AI use.
“If you just use [AI] for your code … you won’t ever learn anything,” Chen added. “A lot of coding is just problem solving and doing it by yourself, and that can only come through practice.”
Multiple students interviewed also acknowledged the limitations of using AI, citing concern over becoming overly dependent.
Vela expressed reservations, adding that AI “discourages us [from going] to office hours … because we can just ask the chatbot, so you don’t get that connection with the teacher.”
Students came to different conclusions on how Penn should address AI in classrooms.
Vela suggested that a better way to use it would be to “try to do [a problem] on your own and then check it with AI.” He also recommended that the University adopt more stringent rules.
“I’m sure students will still find a way to use it, but at least enforcing the rules a little bit more would be helpful,” he said. “Honestly, I don’t think they should really have [the chatbot].”
Ghrist also emphasized that students still need to be trusted to use the tool in a helpful manner.
“A core belief of mine is that we treat our students like the adults that they are and are becoming,” Ghrist said. “Removing the temptation altogether is a level of control that I am not so comfortable with.”
He further explained that professors must design assessments that acknowledge the reality of AI while still making learning the most rewarding choice.
“From a practical point of view, I think the onus is on the professor to make their course incentivize learning,” Ghrist said. “Anyone who is giving their students an essay as an assignment or giving take-home exams, they are not doing their students any favors.”
Looking to the future, multiple students interviewed expressed that AI use is inevitable, a sentiment Ghrist agreed with.
“Rather than say, ‘Wow, this is really difficult, we’re just going to ban it until we figure things out,’ I think it is much better to experiment — even if some of those experiments don’t turn out great,” Ghrist said.