AI and machine learning could be good or bad for the future of climate

Artificial intelligence is killing the planet. Wait, no, it's going to save it. According to Hewlett Packard Enterprise VP of AI and HPC Evan Sparks and Ameet Talwalkar, professor of machine learning at Carnegie Mellon University, it's not entirely clear what AI will do to, or for, our planet.

Speaking at the SixFive Summit this week, the duo discussed one of the most contentious challenges facing AI/ML: the technology's impact on the climate.

“What we have seen over the past few years is that machine learning technology that really requires computation is becoming increasingly prominent in the industry,” said Sparks. “This has heightened concerns about the concomitant increase in energy use and the associated, though not always direct, concerns about carbon emissions and the carbon footprint of these workloads.”

Sparks estimates that AI/machine learning workloads account for more than half of all computing demands today.

Talwalkar, who is also an AI researcher at HPE, cited a 2018 OpenAI blog post showing that the compute requirements of large AI models had increased more than 300,000-fold since 2012.

“That’s a number that, at this point, is about four years old, but I think the trend continues in similar directions,” he added.
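As a rough sense of scale (a back-of-envelope calculation, not a figure from the article): a 300,000-fold increase over roughly six years implies that compute demand doubled every few months.

```python
import math

# Back-of-envelope check of the cited trend: a 300,000x increase in
# compute between 2012 and 2018 (~72 months) implies roughly
# log2(300,000) doublings over that span.
growth_factor = 300_000
months = 6 * 12

doublings = math.log2(growth_factor)    # ~18.2 doublings
doubling_time = months / doublings      # ~4 months per doubling

print(round(doublings, 1), round(doubling_time, 1))  # 18.2 4.0
```

This is consistent with the often-quoted figure of AI training compute doubling every few months during that period, far faster than Moore's Law.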

However, the fact that the broader machine learning community is even considering the impact of AI on climate is a promising sign, Talwalkar noted.

“This was not something we were really thinking about in the machine learning community a few years ago,” he said. “It’s good that we’re getting ahead of this issue and putting pressure on ourselves as a community.”

It’s not too late yet

Talwalkar explained that addressing the environmental ramifications of the spread of AI first requires a better understanding of the problem itself.

“This means learning to precisely measure the degree to which this problem occurs in terms of the power requirements of current AI workloads, as well as coming up with accurate predictions of what we expect future requirements to look like,” he said, adding that these insights will help researchers not only understand the true cost of these workloads, but also take steps to develop more efficient hardware and improved algorithms.

“We are in the midst of a hardware proliferation in terms of specialized hardware designed specifically for training and/or deployment of machine learning models,” he said, citing Google’s tensor processing unit as an early example and noting the ongoing efforts of Nvidia, Graphcore, Cerebras, and others to develop new hardware for machine learning and AI workloads.

“It’s tempting to come up with more hardware to solve the problem, but at the same time I think, as a research community, we’re really pushing the envelope along with computational advances,” Sparks noted, stressing the equal importance of software.

In this regard, Talwalkar argues that a better understanding of how and why deep learning models work can pay dividends when optimizing algorithms to extract more performance from available computing resources.

Artificial intelligence is in its infancy

Despite the challenges, Talwalkar remains optimistic that society will rise to the occasion, and that as the technology matures, the focus will shift from what we can do with these workloads to efforts to make them more efficient.

“We are definitely in the early days of artificial intelligence,” he explained. “It seems like we’re seeing pretty amazing new applications coming out every day.”

Talwalkar believes that AI and machine learning will follow a path not unlike that of the Human Genome Project – a very costly endeavor that laid the foundation for low-cost gene sequencing that has proven very useful.

And while Sparks expressed similar optimism, he doesn’t expect AI/machine learning growth to slow down anytime soon. “At least in the next few years, we’ll see a lot more, not much less.” ®
