A self-driving car that successfully passes tests worthy of a California driver’s license may sound like something out of a James Bond movie.
However, this very feat was achieved at Penn in the form of “Little Ben” — a self-driving Toyota Prius created in 2007 by Engineering professor Daniel Lee and a number of his colleagues in the School of Engineering and Applied Science.
Today, Little Ben is just one of several successful transportation-related projects to come out of the Engineering School and the General Robotics, Automation, Sensing and Perception Laboratory. From soccer-playing robots to a free-space detection system, faculty and students alike are increasingly using their skills to build machines that aim to transform transportation.
According to Lee, there are several difficulties involved in creating an intelligent machine like Little Ben. For example, he said, one of the main challenges that came with Little Ben was the machine’s inability to be as accurate as a human in making assumptions and “building up a representation of the world based on a cognitive viewpoint.”
“When we drive, we make assumptions about the world based on how other people are driving,” Lee said. “Computers have a hard time understanding these nuances.”
Therefore, he added, “the focus of our research is to think about algorithms that would make machines intelligent and make decisions like humans do.”
For other transportation-related projects, the inability of robots to perceive and process sensory stimuli like a human proved to be an added challenge. Lee said this difficulty has come to light with another invention — a group of soccer-playing robots.
“It’s the same sensory perception process — robots have to figure out what’s going on in the game, quickly make a decision based on the information and then actually act it out,” Lee said of the robots, which were created by the UPennalizers, a group of Engineering students who compete in the annual RoboCup Sony Aibo League. The competition involves humanoid robots playing soccer, albeit with slightly modified rules.
While robots playing soccer may not seem relevant to mobility and transportation research, Engineering junior and UPennalizers member Ashleigh Thomas said the machines’ internal systems have a great deal of practical application.
She pointed to the systems involved in determining a robot’s location as an example of this.
“Computer vision has applications for anything mobile, for anything that needs to recognize things around it,” she said.
In addition to sensory perception, locomotion and stability are the two other core systems involved in creating robots.
“In locomotion for humanoid robots, stability is the most difficult thing to achieve. Just simply having the robot not fall over is difficult,” Thomas said.
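The stability challenge Thomas describes is often framed in terms of a simple geometric test: a standing robot is statically stable when the projection of its center of mass falls inside the “support polygon” traced out by its feet. The sketch below illustrates that idea only — the function names and the polygon test are illustrative assumptions, not the UPennalizers’ actual code.

```python
# Illustrative static-stability check for a legged robot: the robot is
# statically stable when its center of mass, projected onto the ground,
# lies inside the convex support polygon formed by its feet.
# (Hypothetical helpers for illustration, not the UPennalizers' code.)

def cross(o, a, b):
    """2D cross product of vectors OA and OB."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def is_statically_stable(com_xy, support_polygon):
    """Return True if com_xy lies inside the convex support polygon
    (vertices listed in counter-clockwise order)."""
    n = len(support_polygon)
    for i in range(n):
        a = support_polygon[i]
        b = support_polygon[(i + 1) % n]
        if cross(a, b, com_xy) < 0:  # point falls to the right of edge a->b
            return False
    return True

# Feet planted at the corners of a square support region (CCW order).
feet = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(is_statically_stable((0.5, 0.5), feet))  # center of mass over the feet
print(is_statically_stable((1.5, 0.5), feet))  # leaning past the feet: unstable
```

A walking robot is harder still, since the support polygon changes every time a foot lifts, which is why “simply having the robot not fall over is difficult.”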
The code that the UPennalizers developed for their humanoid robots has a variety of uses outside the RoboCup competitive circuit. According to Thomas, the UPennalizers open-sourced their code, meaning it can be used across nine different robot platforms in mobility and transportation robotics.
Yida Zhang, a second-year Engineering graduate student, is one of several students conducting research based on robotic movement that uses this same open-source code.
His project revolves around vision-based free-space detection — the opposite of obstacle detection.
“Instead of looking for obstacles in your way, you look for things not in your way,” Thomas explained.
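Conceptually, Thomas’s description amounts to taking the complement of an obstacle map. The minimal sketch below assumes a binary occupancy grid as input; it is an illustration of the idea, not Zhang’s or the UPennalizers’ actual vision code.

```python
# Minimal sketch of free-space detection as the complement of obstacle
# detection (illustrative only; real vision-based systems are far more
# involved).

def detect_free_space(occupancy_grid):
    """Given a 2D grid where True marks a detected obstacle,
    return a grid where True marks free space."""
    return [[not cell for cell in row] for row in occupancy_grid]

# Example: one obstacle in a small field of view.
grid = [
    [False, True, False],
    [False, False, False],
]
free = detect_free_space(grid)
# Every cell that is not an obstacle is marked as free space.
```

In practice the payoff is that a planner can search directly over the cells known to be clear, rather than reasoning about everything that might be in the way.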
In the end, Lee added, “Robots are created for any of the three D’s — dull, dirty or dangerous. Any future of robotics will be for these jobs.”