The Penn Course Review's Will Lavery and Andrew Lee perch atop thousands of student surveys used to determine the secrets of Penn's courses. [Phil Leff/The Daily Pennsylvanian]

Through the narrow paths between walls of textbooks and loose-leaf paper, around the piles and bins of pens and glowing highlighters and nestled between foreign language dictionaries and Shakespearean sonnets is an empty space.

Once occupied by the undergraduate course Bible, the must-have reference tool for discovering the truth about professors and class intensity, the space in the bookstore now remains vacant.

The Penn Course Review is gone.

But don't panic -- the famed guide is now free and online.

On Sept. 13, the Penn Course Review premiered its Web site: www.penncoursereview.com.

The move to the Web continues the evolution of this Penn publication, which has a long and colorful history -- and it renews the debate over the guide's validity as a predictor of course quality.

The site is still under construction, but staffers maintain that it will soon match the quality of the printed version. At this point, the site offers reviews of about 200 courses -- fewer than half of what the printed editions contained -- and no evaluations are available for the School of Engineering and Applied Science.

Although it had been in the works for an entire year, the site was finally built over the summer by a hired Web designer for $3,000. The Review now needs only $200 per year to maintain its spot on the Web.

"If we're online, it's free for the students," Business Manager Will Lavery says. The College senior explains that the move not only eliminates cost, but the book's staffers no longer have to deal with the printing and distribution of the book -- a process that costs $30,000.

Sustained almost solely by the money raised through sales of the Review, the organization upped the book's price by nearly $3 last year -- to a record-high $12.95 -- in order to meet climbing printing costs and build up funds for the move.

Whether online or in print, staffers maintain that the evaluations, based on roughly 17,000 student surveys, provide a valuable tool for choosing a schedule.

"It keeps us from walking into a class blind," says Melinda Sanchez, last year's co-editor-in-chief and current Fine Arts graduate student. "It's students telling us how it really was."

Students have been cluing others in since 1959, when the editors of The Daily Pennsylvanian first started the Review.

The then-33-page pamphlet was published to provide student descriptions of courses offered at the undergraduate level.

It was sold for 25 cents per copy.

Across the page from an advertisement for $30 Dacron-blend suits and $5 wash-and-wear Bermuda shorts, the evaluations from that first year lacked the number ratings on which countless undergraduates now rely.

The only information listed indicated for whom the classes were designed -- engineers, College students or Whartonites. One other tidbit was provided about each course: whether it targeted solely male students or whether women were encouraged to attend as well.

While different from today's version, the innovative publication quickly became popular.

In 1960, the editors described the first edition as "highly successful," and began publishing the Review every semester.

Two years later, though, the Review returned to a thicker, yearly publication format.

The guide began to evolve as soon as the first edition was distributed to students. In the 1960 fall edition, the foreword explains that after its initial release, students began pressing for a more evaluative and critical view of course offerings, rather than simply a description of the material covered in each course.

And so, in 1960, the opinion- and critique-oriented Penn Course Review -- then referred to as the Course Guide -- was born.

"I think the idea was borrowed from one of the other Ivies," says Anthony Lyle, 1960 DP editor-in-chief.

He adds that students found the 1959 undergraduate course listing provided by the University to be "very dry." The lack of student voice and evaluative stance in the listing is what spurred the idea for an alternate, student-created critique.

"It was useful to hear about what their peers had to say about the content of a course and the ability of an instructor," Lyle explains.

The original forms distributed to students were nearly identical to those used today. But the rating scale ran from 1 to 5 (as opposed to the current 0-to-4 scale), and most of the questions asked about the quality of the course, its difficulty, the value of assigned reading and the instructor's teaching skills and competence in the subject.

Lyle notes that initially, some Penn professors "didn't cooperate" with the DP's requests to distribute evaluation forms and were wary of having their courses and teaching rated in such a manner.

But "eventually it did catch hold" and has since "helped to contribute to undergraduate life," he says.

As the years have gone by, the Review has continued to grow in length, course coverage and price.

By the 1961-1962 edition, the gender designations had disappeared.

By the 1965-1966 edition, the book had reached 96 pages.

By the 1966-1967 edition, the DP editors were receiving 2,000 returned surveys, and the book began to include its most popular feature up to that point -- a tally of the letter grades students received in each course.

In 1971, as part of its effort to revamp undergraduate advising, the Student Committee on Undergraduate Education took control of the publication. In later written histories, the transfer would be deemed a "hostile takeover."

For 11 years, SCUE produced the ever-changing Review. During those years, the book nearly doubled in length, the number of survey respondents grew to over 10,000 and the numerical rating scale became a staple of the publication.

Spiral-bound and thicker than ever before, the guide was handed over to an autonomous group known as the Penn Course Review in 1982.

In 1985, due to a shortage of staff members, the Review was not produced. The book's absence -- forever known in Review history as the disastrous "year without the Course Guide" -- created a stir across campus.

The Undergraduate Assembly responded in 1986 to student concern over the book's disappearance by funding its publication for the next three years. With this push, the guide grew in size and started selling in the Penn bookstore in 1989.

The publication has remained fairly similar from that point forward, with the addition of four punchy new pieces -- Perfect Professors, Notable Quotables and the Halls of Fame and Shame.

With its move from print to the Web, the Review's editors intend to uphold the guide's traditional form.

But regardless of whether the review is online or in print, the evaluation process is the same.

Each semester, departments sort and scan thousands of bubble sheets to produce spreadsheets displaying the number ratings printed in each guide.

After completing the calculations, the departments send the spreadsheets, along with thousands of washed-out photocopied surveys, to their respective umbrella schools -- the College, Wharton, SEAS or the Nursing School -- which then deliver the collected information to the staff of the Review.

With each critique, professors and courses are branded with shining seals of approval or scathing rejections.

While the numbers may allow for quick evaluation and the witty quotes a gleeful laugh, does any of this actually help students find Penn's best courses and professors?

"The numbers are a very broad range of student opinion," which can result in "inflated positive and negative ratings," Mathematics Professor Todd Drumm explains.

Because the surveys encourage student comment, students are more inclined to respond if they either cherish or despise a course.

As a result, the more ambivalent opinions tend to get lost in the extreme glee and biting comments highlighted in the evaluations, according to Drumm.

Drumm also notes that readers should "worry about the return rate and the correlation between the grades received and the ratings," emphasizing that "there's no representation or accuracy with the data based on return rates below 50 percent."

Similarly, Chemistry Department Undergraduate Chairman Don Berry expresses some wariness over the reliability of the tool.

"Basically it's one piece of information, and while we certainly pay close attention to it, it's not necessarily the only measure of teaching quality," he says.

Overall, most departmental chairs and faculty have mixed feelings concerning the ability of the statistics to express the value of a course and the level of instruction.

They also voice concern that evaluation writers might favor more dramatic comments over middle-of-the-road views of courses.

Even with these concerns over accuracy, administrators appear to view the guide as fairly harmless.

Despite her rating of 4 out of 4 and her consequent placement in the Perfect Professors category, History Professor Barbara Savage is fairly ambivalent on the subject.

"I think the statistics are a very imperfect way to measure teaching quality," she says, still concluding that "it certainly seems to be a useful tool for students."

Numerous faculty members note that courses are often discontinued or are frequently taught by different professors from semester to semester, thereby nullifying many of the ratings and course evaluations in the guide.

Generally, student opinion seems to parallel that held by faculty on the value and accuracy of the Review. The publication -- targeted mostly toward freshmen -- has noticeable weaknesses for upperclassmen.

College junior Eleanor Tai says she finds the Review "helpful for certain classes, especially the larger ones," but explains that, due to the rapid turnover of both professors and course subject matter, the majority of smaller courses offered each semester aren't represented accurately, if at all, in the Review.

Drumm also says that the guide is most valuable when partnered with peer consultation.

Overall, the consensus of students and professors alike is that while the Review may be worth checking out, it remains an imperfect method of choosing classes.

But whether it's a reliable tool or merely a form of entertainment, at least now it's free.
