As demand for data centers increases nationwide, Penn faculty are researching sustainable pathways to accommodate the energy needs of artificial intelligence.
Data centers — which accounted for 4% of United States energy consumption in 2024 — are built to keep up with the growing computational demands of AI. In interviews and statements to The Daily Pennsylvanian, Penn faculty and researchers discussed the significant economic and environmental impacts of these facilities.
John Quigley, senior fellow at the Kleinman Center for Energy Policy, explained how data centers “of unprecedented size” consume vast amounts of electricity, impact the local water supply, and drive up electricity prices.
“A typical data center can consume as much energy as a city, potentially millions of square feet under roof,” he said. “Buildings that large change the character of communities.”
On the ground, Penn faculty are researching ways to lower costs and optimize energy use through engineering and policy solutions.
In a statement to the DP, Engineering professor Benjamin Lee outlined several short- and long-term solutions to improve sustainability, including improving “power accounting and transparency” and accelerating investment in renewable energy infrastructure.
“Software solutions could include more efficient AI models that compute accurate answers with fewer calculations,” Lee wrote. “Hardware solutions could specialize systems to require less energy per calculation.”
Lee is co-director of Carbon Connect, a National Science Foundation project aiming to “rethink computing infrastructure, from semiconductors to data centers, for sustainability and efficiency.”
Quigley — who has served as a cabinet member to two Pennsylvania governors — emphasized the role of state-level policy incentives in “propelling the clean energy transition” and discouraging fossil fuel generation.
“These facilities need to pay their own way in every respect, both in terms of generation and infrastructure transmission,” Quigley said. “Then we need to be requiring things like load flexibility, so that when, for example, the grid is under high stress … we don’t create blackouts and push prices up through the stratosphere.”
Quigley described how data centers in Pennsylvania can receive tax exemptions from the state government, which “is projected to cost Pennsylvania at least a quarter of a billion dollars in the next few years.”
In a recent budget proposal, Pennsylvania Gov. Josh Shapiro introduced a policy requiring data center developers to pay for their own power and “commit to strict transparency standards and direct community engagement.”
The plan — which maintains tax credits and permits for data centers — drew criticism from some environmentalists.
Engineering professor Ali Zaidi — who previously served as national climate advisor under former President and Benjamin Franklin Presidential Practice Professor Joe Biden — emphasized how AI could be used to accelerate a transition to clean energy sources.
“We have a moment where we could go way big on clean energy, and we’re coming off of, year after year, record-setting deployments of solar, batteries, and wind,” Zaidi said. “Why not use the AI economy as the demand anchor to generate some of the additional reforms that we need in terms of reducing permitting time and clarifying the rules of the grid and shaping the system to be more pro-consumer?”
He also stressed the importance of expanding the grid’s capacity and optimizing current infrastructure.
“The road to lower rates for our consumers and higher competitiveness for our industry runs through expanding the amount of clean electricity that can connect to and deliver on our grid,” Zaidi said.
He added, “I see it as a massive opportunity for us, and we have all of the building blocks.”
In a December 2025 paper, a group of Penn Engineering researchers proposed a solution to growing environmental concerns: a space-based system — featuring solar panels and a collection of central processing units and graphics processing units — that could provide a stable source of energy while orbiting the Earth.
“You can pick orbits that fly over what’s called the terminator line, which is the imaginary line that separates the dark side and the lit side of Earth,” Engineering professor Igor Bargatin said. “And if you fly over that all the time, you will never experience any shadowing from the Earth.”
The system, however, risks being “pelted with small particles, kind of like a sandstorm,” Bargatin explained. Third-year Engineering graduate student Dengge Jin — who co-authored the paper — helped simulate possible disruptions.
Jin said the tethers, which are naturally held in tension by the difference between Earth’s gravitational pull toward the planet and the centrifugal force acting in the opposite direction, can resist rotation caused by these collisions.
“Let’s say a micrometeorite hits a single PV panel,” Jin said. “It will cause this node to rotate, but because you have very high rotational resistance, this will propagate through the whole system at a very fast speed and reduce the rotational amplitude.”
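The pretension Jin describes comes from the gravity-gradient effect: gravity and centrifugal force cancel exactly only at a system’s orbital center of mass, so any node offset from that center feels a small residual pull that keeps the tether taut. As a rough back-of-the-envelope sketch — with illustrative numbers, not figures from the Penn paper — that residual force can be estimated as follows:

```python
import math

# Gravity-gradient ("tidal") tension sketch for a radially aligned tether
# in a circular low-Earth orbit. Altitude, mass, and offset below are
# made-up illustrative values, not parameters from the researchers' model.
MU = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
r = 6_371e3 + 550e3    # orbital radius: Earth's radius + 550 km altitude, m

n = math.sqrt(MU / r**3)   # orbital mean motion (angular rate), rad/s

def tether_tension(mass_kg: float, offset_m: float) -> float:
    """Residual force on a mass offset radially from the system's center
    of mass: to first order, gravity minus centrifugal force leaves
    ~3 * n^2 * offset of net acceleration pulling the mass away from
    the center, which is what keeps the tether in tension."""
    return 3.0 * mass_kg * n**2 * offset_m

# Example: a 100 kg panel node 50 m from the center of mass.
T = tether_tension(100.0, 50.0)   # tension in newtons
```

The force is tiny in absolute terms, which is consistent with Bargatin’s point that impacts and other disturbances remain a real design concern even though the geometry naturally keeps the structure taut.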
Several factors still need to be considered, according to Bargatin — including deployment, temperature stability, the effect of radiation in space, and “space traffic.” To make a meaningful impact, he explained, space data centers may require that the number of satellites in space increase from thousands to millions.