Moving bits, not watts

August 25, 2020

The phrase "too much of a good thing" may sound like a contradiction, but it encapsulates one of the key hurdles preventing the expansion of renewable energy generation. Too much of a service or commodity makes it harder for companies to sell them, so they curtail production.

Usually that works out fine: The market reaches equilibrium and economists are happy. But external factors are bottlenecking renewable electricity despite the widespread desire to increase its capacity.

UC Santa Barbara's Sangwon Suh is all too familiar with this issue. The professor of industrial ecology at the Bren School of Environmental Science & Management has focused on this and related challenges for at least the past two years. "Curtailment is the biggest problem of renewable energy we are facing," said Suh, who noted it will only escalate as renewable energy capacity increases.

Now Suh, Bren doctoral student Jiajia Zheng and Andrew Chien at the University of Chicago have presented an innovative proposal to address this issue by routing data center workloads between regions. The concept, laid out in their newly published study, is cheap, efficient and requires minimal new infrastructure. Yet it could avoid thousands of tons of greenhouse gas emissions per year, all while saving companies money and encouraging the expansion of renewable energy.

The main roadblock

Curtailment comes into play when renewable energy sources generate more electricity than is required to meet demand. Modern power grids balance energy supply and demand in real time, every minute of every day. Extra electricity would overwhelm them, so it needs to be either stored, sold off or curtailed.

This occurs because reliable energy sources -- like fossil fuel and nuclear power plants -- are critical to grid stability, as well as to meeting nighttime demand. These facilities have to operate above a minimum capacity, since shutting down and restarting them is both costly and inefficient. This sets a floor on electricity from conventional power sources, and when renewables push total generation beyond demand, the extra energy is effectively useless.
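As a rough illustration of that floor effect, the curtailed energy in a given hour is simply whatever renewable output cannot fit under demand once conventional plants are already running at their minimum. The numbers in this sketch are invented for the example, not drawn from the study:

```python
# Toy illustration of the generation floor described above.
# All numbers are invented for this example; they are not from the study.

demand_mw = 20_000            # total demand on the grid this hour
conventional_min_mw = 8_000   # conventional plants cannot dip below this output
renewables_mw = 14_000        # solar and wind output this hour

# Renewables can only serve whatever demand is left above the conventional floor.
usable_renewables_mw = min(renewables_mw, demand_mw - conventional_min_mw)
curtailed_mw = renewables_mw - usable_renewables_mw

print(f"Renewables used: {usable_renewables_mw:,} MW")  # 12,000 MW
print(f"Curtailed:       {curtailed_mw:,} MW")          # 2,000 MW
```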

California is a case study in the challenges of variable renewable electricity and the problem of curtailment. Presumably the state could sell its surplus electricity to neighbors. Unfortunately, many power grids are encountering the same problem, and the transmission network has limited capacity. As a result, the state has resorted to selling excess electricity at a negative price, essentially paying other states to take the energy.

There are two other solutions for dealing with excess electricity aside from simply curtailing energy generation, Suh explained. Energy can be stored in batteries and even hydroelectric reservoirs. That said, batteries are incredibly expensive, and hydropower storage is only suitable for certain locations.

The other option is to use the extra electricity to generate things of value that can be used later. "Whatever we produce will have to be stored and transported to where it's needed," Suh pointed out. "And this can be very expensive.

"But," he added, "transporting data and information is very cheap because we can use fiber optics to transmit the data literally at the speed of light." As the authors wrote in the study, the idea behind data load migration is "moving bits, not watts."

An innovative idea

The task ahead of the authors was clear. "The question we were trying to answer was: Can we process data using excess electricity?" Suh said. "If we can, then it's probably the cheapest solution for transporting the product or service made using excess electricity."

Currently, Northern Virginia hosts most of the nation's data centers. Unlike California's grid, CAISO, the grid serving Northern Virginia, PJM, relies heavily on coal-fired power plants, "the dirtiest electricity that we can ever imagine," in Suh's words.

Suh, Zheng and Chien propose that workloads from the PJM region could be sent to centers out west whenever California has excess electricity. The jobs can be accomplished using electricity that otherwise would have been curtailed or sold for a loss, and then the processed data can be sent wherever the service is needed. Data centers usually have average server usage rates below 50%, Zheng explained, meaning there is plenty of idle capacity ready to be tapped.
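A minimal sketch of that scheduling idea might look like the following. The function name, forecast figure and capacity figure are hypothetical illustrations used to show the logic, not the authors' actual model:

```python
# Hypothetical sketch of "moving bits, not watts": send deferrable compute jobs
# from the PJM region to California when CAISO is curtailing renewable power and
# western data centers have idle servers. Names and figures are illustrative
# assumptions, not the authors' actual model.

def migratable_load_mwh(curtailment_forecast_mwh: float,
                        idle_capacity_mwh: float) -> float:
    """Energy worth of eastern jobs that could run on otherwise-curtailed power."""
    return max(0.0, min(curtailment_forecast_mwh, idle_capacity_mwh))

# Example hour: CAISO expects 500 MWh of curtailment, and California data centers
# have roughly 300 MWh of spare server capacity (average utilization below 50%).
shifted = migratable_load_mwh(curtailment_forecast_mwh=500.0, idle_capacity_mwh=300.0)
print(f"Load shifted west this hour: {shifted} MWh")  # 300.0 MWh
```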

This plan is not only environmentally sound; it represents significant savings for the companies using these services. "This approach could potentially save the data center operators tens of millions of dollars," said lead author Zheng. Since the electricity would otherwise have been useless, its cost to the company is essentially zero.

The authors analyzed CAISO's historical curtailment data from 2015 through 2019. They found that load migration could have absorbed up to 62% of the electricity CAISO curtailed in 2019. That's nearly 600,000 megawatt-hours of previously wasted energy -- roughly as much electricity as 100,000 Californian households consume in a year.
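That household comparison holds up to back-of-the-envelope arithmetic; the per-household consumption figure below is an assumed typical California value rather than a number from the study:

```python
# Back-of-the-envelope check on the household comparison. The per-household
# figure is an assumed typical California value, not a number from the study.

absorbed_mwh = 600_000           # curtailed energy that load migration could absorb
household_kwh_per_year = 6_000   # rough annual consumption of a California household

households = absorbed_mwh * 1_000 / household_kwh_per_year
print(f"Roughly {households:,.0f} households' annual electricity use")  # ~100,000
```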

At the same time, the strategy could have avoided the equivalent of up to 240,000 metric tons of CO2 emissions in 2019 using only existing data center capacity in California. "That is equivalent to the greenhouse gas emissions from 600 million miles of driving using average passenger vehicles," Suh said. And, rather than costing money, each ton of CO2 emissions averted by shifting loads between power grids would actually provide around $240 in savings due to decreased spending on electricity.
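Those figures are mutually consistent under common assumptions; the per-mile emission factor below is a typical average-passenger-vehicle value assumed here for illustration, not a number quoted from the paper:

```python
# Rough consistency check on the emissions and savings figures. The per-mile
# emission factor is a typical average-passenger-vehicle value, assumed here
# for illustration rather than taken from the paper.

avoided_tonnes_co2 = 240_000
grams_co2_per_mile = 400          # assumed average passenger-vehicle emission rate

equivalent_miles = avoided_tonnes_co2 * 1_000_000 / grams_co2_per_mile
print(f"~{equivalent_miles / 1e6:.0f} million miles of driving")  # ~600 million

savings_per_tonne_usd = 240
total_savings_usd = avoided_tonnes_co2 * savings_per_tonne_usd
print(f"~${total_savings_usd / 1e6:.0f} million saved")  # ~$58 million
```

The last line also squares with Zheng's estimate that the approach could save data center operators tens of millions of dollars.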

Untapped potential

These findings were in line with what the authors expected. It was the ramifications that amazed them. "What surprised me was why we were not doing this before," Suh said. "This seems very straightforward: There's excess electricity, and electricity is a valuable thing, and information is very cheap to transmit from one location to another. So why are we not doing this?"

Suh suspects it may be because data center operators are less inclined to cooperate with each other under current market pressures. Despite the environmental and financial benefits, these companies may be reluctant to outsource data processing to a facility run by a different firm.

In fact, the data center industry is somewhat of a black box. "It was very challenging for us to get detailed information on the power usage characteristics and energy consumption data from the industry," said Zheng.

Harnessing the potential of curtailed renewable energy will require fluid coordination between the data center operators. Shifting the system may require changing the incentives currently at work. This could take the form of new regulations, a price on carbon emissions or collaborations between rival companies.

"Two different things need to happen in parallel," Suh said. "One is from the private sector: They need to cooperate and come up with the technological and managerial solutions to enable this. And from the government side, they need to think about the policy changes and incentives that can enable this type of change more quickly."

A widespread price on carbon emissions could provide the necessary nudge. California already has a carbon price, and Suh believes that, as additional states follow suit, it will become more economically attractive for companies to start using the strategies laid out in this study.

And these strategies have huge growth potential. Data processing and renewable electricity capacity are both growing rapidly. Researchers predict that the datasphere will expand more than fivefold from 2018 to 2025. As a result, there is a lot of room for data centers to absorb additional processing needs using excess renewable energy in the future.

This paper offers only a conservative estimate of the financial and environmental benefits of data load migration, Suh acknowledged. "As we increase the data center capacity, I think that the ability for a data center to be used as a de facto battery is actually increasing as well," he said.

"If we can think ahead and be prepared, I think that a substantial portion of the curtailment problem can be addressed in a very cost-effective way by piggybacking on the growth of data centers."
-end-


University of California - Santa Barbara
