Think tank funded by Big Tech argues AI’s climate impact is nothing to worry about

  • 📰 TheRegister


Sure, datacenters consume lots of energy. But maybe they'll invent stuff that helps

According to the Information Technology and Innovation Foundation's Center for Data Innovation, a Washington DC-based think tank backed by tech giants including Intel, Microsoft, Google, Meta, and AMD, the infrastructure powering AI isn't a major threat. In its report, the Center posited that many of the concerns raised over AI's power consumption are overblown and draw from flawed interpretations of the data.

But it's also a number that's hard to take at face value, because datacenters are complex systems. Measuring the carbon footprint or energy consumption of something like training or inferencing an AI model is therefore prone to error, the CDI study contends, without irony. One example it cites is a study by the University of Massachusetts Amherst that estimates the carbon footprint of Google's BERT natural language processing model.

There are a few reasons for this, but one of them is that AI hardware is getting faster, and another is that the models that make headlines may not always be the most efficient, leaving room for optimization. The CDI report's argument appears to be that past attempts to extrapolate power consumption or carbon emissions haven't aged well, either because they make too many assumptions, are based on flawed measurements, or fail to take into account the pace of hardware and software innovation.

The report offers several suggestions for how policymakers should respond to concerns about AI's energy footprint.

"Policymakers rarely consider that their demands can raise the energy requirements to train and use AI models. For example, debiasing techniques for LLMs frequently add more energy costs in the training and fine-tuning stages," the report reads. "Similarly, implementing safeguards to check that LLMs do not return harmful output, such as offensive speech, can result in additional compute costs during inference."

