High-performance computing is becoming the lifeblood of artificial intelligence research. (Intel Photo)

The development of ever more powerful models for artificial intelligence is revolutionizing the world, but it doesn’t come cheap. In a newly released position paper, researchers at Seattle’s Allen Institute for Artificial Intelligence argue that more weight should be given to energy efficiency when evaluating research.

The AI2 researchers call on their colleagues to report the “price tag” associated with developing, training and running their models, alongside other metrics such as speed and accuracy. Research leaderboards, including AI2’s, regularly rate AI software in terms of accuracy over time, but they don’t address what it took to get those results.

Of course, cutting-edge research can be expensive in all sorts of fields, ranging from particle physics done at multibillion-dollar colliders to genetic analysis that requires hundreds of DNA sequencers. Financial cost or energy usage isn’t usually mentioned in the resulting studies. But AI2’s CEO, Oren Etzioni, says that times are changing – especially as the carbon footprint of energy-gobbling scientific experiments becomes more of a concern.

“It is an ongoing topic for many scientific communities, the issue of reporting costs,” Etzioni, one of the position paper’s authors, told GeekWire. “I think what makes a difference here is the stunning escalation that we’ve seen” in the resources devoted to AI model development.

One study from OpenAI estimates that the computational resources required for top-level research in deep learning increased 300,000-fold between 2012 and 2018, driven by the rapid development of ever more complex models. “This is much faster than Moore’s Law, doubling every three or four months,” Etzioni said.
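The two figures in that estimate are consistent with each other, as a quick back-of-the-envelope check shows: a 300,000-fold increase over roughly six years implies a doubling time of about four months.

```python
import math

# Back-of-the-envelope check of OpenAI's estimate: a 300,000x increase
# in training compute over roughly six years (2012-2018).
growth_factor = 300_000
months = 6 * 12

doublings = math.log2(growth_factor)      # how many doublings fit in 300,000x
doubling_time = months / doublings        # months per doubling

print(f"{doublings:.1f} doublings, one every {doubling_time:.1f} months")
```

That works out to about 18 doublings, one every four months, which matches the “three or four months” Etzioni cites and far outpaces Moore’s Law’s roughly two-year doubling.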

When it comes to energy requirements, Etzioni doesn’t want to see AI research go the way of bitcoin mining, which is already putting a strain on power companies. But it’s not just the environment he’s worried about. The financial cost of running a project such as Google DeepMind’s AlphaGo game-learning program can amount to more than $1,000 an hour.

“There is an important issue of inclusiveness, where [not only] people from emerging economies, but even students and academics and startups can get increasingly shut out of the cutting edge if it’s the case that you need, say, a billion dollars … to do cutting-edge AI research,” he said.

Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence, argues that startups would benefit from a “Green AI” approach to research. (GeekWire Photo / Alan Boyle)

AI2’s researchers emphasize that they’re not calling for an end to the high-resource, high-cost approach, which they call “Red AI.” Rather, they want to turn a brighter spotlight on “Green AI” – an approach that aims to do cutting-edge AI with greater efficiency.

“If you make AI greener, it’s not just cheaper, but it opens the way toward more efficient techniques to further push the state of the art,” Etzioni said.

Such techniques could come closer to matching the workings of the human brain, which far outdoes any AI model in terms of general performance and efficiency.

“The carbon footprint of our thinking is a salad and maybe an occasional taco,” Etzioni joked.

He and his colleagues propose including the total number of floating point operations required to reach a given result as a routine part of research papers. They argue that floating point operations, or FPOs, serve as a raw, hardware-independent tally of computational work and would be a more sensible metric than, say, carbon emissions, electricity usage or elapsed real time.
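To make the metric concrete, here is a minimal sketch of the kind of FPO bookkeeping the proposal implies. The layer sizes and the helper function are illustrative assumptions, not taken from the paper; the idea is simply that each dense layer costs a multiply and an add per weight, so the total can be tallied analytically.

```python
# Illustrative FPO tally for one forward pass through a small
# fully connected network. Layer sizes are hypothetical.

def dense_layer_fpos(n_in: int, n_out: int) -> int:
    """FPOs for y = Wx + b: one multiply and one add per weight,
    with the bias add folded into the accumulation."""
    return 2 * n_in * n_out

layer_sizes = [784, 256, 64, 10]  # e.g. an MNIST-sized toy network
total = sum(dense_layer_fpos(a, b) for a, b in zip(layer_sizes, layer_sizes[1:]))
print(f"{total:,} FPOs per forward pass")
```

Because the count depends only on the model architecture, two labs running on very different hardware would report the same number, which is exactly the property the authors want from the metric.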

Researchers could also track how accuracy increases as a function of budget. “Reporting this curve will allow users to make wiser decisions about their selection of models and highlight the stability of different approaches,” they write.
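The shape of such a curve is the point: accuracy typically shows diminishing returns as compute grows. The snippet below uses a synthetic saturating curve, explicitly a stand-in and not real measurements, to show what reporting accuracy against an FPO budget might look like.

```python
import math

def accuracy_at(budget_fpos: float) -> float:
    """Hypothetical diminishing-returns curve (illustrative, not measured)."""
    return 0.95 * (1 - math.exp(-budget_fpos / 1e9))

# Report accuracy at a few compute budgets, as the authors suggest.
for budget in (1e8, 1e9, 1e10):
    print(f"{budget:.0e} FPOs -> {accuracy_at(budget):.3f} accuracy")
```

In a real paper the pairs would come from checkpoints during training; the reader could then see, for example, that the last tenfold increase in compute bought only a few points of accuracy.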

They also hail the trend toward the public release of pre-trained models, such as Google’s BERT and XLNet models, as a “green success.”

“We would like to encourage organizations to continue to release their models in order to save others the costs of retraining them,” the researchers write.

Etzioni said he’s gotten positive feedback since this week’s release of the position paper, and hopes that more efficient research models will give rise to less expensive and more efficient AI applications as well. One of AI2’s spin-outs, XNOR.ai, is already making energy efficiency a priority for its products.

“Deployment is highly relevant,” Etzioni said. “I feel like there’s a healthy economic incentive there.”

The authors of “Green AI” include Etzioni, Roy Schwartz, Jesse Dodge and Noah Smith.
