Artificial intelligence can be a great tool for solving environmental problems, but if not designed and used carefully, it can make those problems worse. (Image: sippapas/Adobe Stock)
Yes, It Can Help

What AI does best is quickly sort through mountains of data to extract meaningful information. That ability can help officials create more effective policies for addressing climate issues by spotting trends that would otherwise go unnoticed.

It can also be used to manage the electrical grid so that renewables are integrated smoothly and safely. By analyzing energy supply and demand, it can automatically maintain the optimum balance; for example, smart devices can be scheduled to run when measured demand is at a minimum.
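
As a rough illustration of that kind of scheduling, the sketch below picks the lowest-demand window in an hourly forecast for running a deferrable appliance. The demand figures and the `schedule_flexible_load` helper are hypothetical, chosen only to show the idea.

```python
# Sketch: place a deferrable load (e.g., an EV charger or dishwasher)
# in the lowest-demand window of an hourly grid-demand forecast.
# The demand figures are illustrative, not real grid data.

def schedule_flexible_load(hourly_demand_mw, run_hours):
    """Return the start hour whose run_hours-long window has the
    lowest total forecast demand."""
    best_start, best_total = 0, float("inf")
    for start in range(len(hourly_demand_mw) - run_hours + 1):
        total = sum(hourly_demand_mw[start:start + run_hours])
        if total < best_total:
            best_start, best_total = start, total
    return best_start

if __name__ == "__main__":
    # 24 hourly demand values (MW), lowest overnight, peaking in the evening.
    forecast = [20, 18, 17, 16, 16, 18, 25, 32, 35, 34, 33, 32,
                31, 30, 31, 33, 38, 42, 45, 43, 38, 32, 27, 22]
    start = schedule_flexible_load(forecast, run_hours=2)
    print(f"Run the 2-hour load starting at hour {start}:00")
```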

The World Economic Forum has listed specific ways AI is currently helping with climate problems:

  • Measuring changes in icebergs, an indicator of how much water is being added to the ocean.
  • Analyzing deforestation locations and rates and, along with drones, reseeding areas where it is needed.
  • Predicting weather patterns.
  • Analyzing waste processing and recycling facilities.
  • Creating detailed maps of ocean litter.
  • Predicting impending climate disasters for specific locations.
  • Predicting wind farm output.
  • Enabling companies to track, trace, and reduce their emissions.

In these ways and more, AI is a valuable tool in the fight to minimize environmental damage.

Yes, It Can Hurt

Let me start with a “short story” from a 2018 essay by Kate Crawford and Vladan Joler, “Anatomy of an AI System.” They ask the reader to picture a woman walking into a room carrying a sleeping child. She gives a voice command to her virtual assistant: “Turn on the hall lights.”

But what is entailed in this simple interaction? The virtual assistant is “… a disembodied voice that represents the human-AI interaction interface for an extraordinarily complex set of information processing layers. These layers are fed by constant tides: The flows of human voices being translated into text questions, which are used to query databases of potential answers, and the corresponding ebb of Alexa’s replies. For each response that Alexa gives, its effectiveness is inferred by what happens next. Put simply: Each small moment of convenience — be it answering a question, turning on a light, or playing a song — requires a vast planetary network, fueled by the extraction of non-renewable materials, labor, and data. The scale of resources required is many magnitudes greater than the energy and labor it would take a human to operate a household appliance or flick a switch.”

Every time AI is put to use, it initiates a chain of events with far-reaching consequences, but well before it even becomes usable, it requires vast amounts of resources. For artificial intelligence to respond to “natural language,” it has to be trained on large amounts of data. One paper compared the environmental cost, in pounds of CO2, to other sources of emissions and estimated that training a single typical natural language processing (NLP) model produces about as much CO2 as five automobiles over their lifetimes.
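
To make that kind of estimate concrete, here is a minimal back-of-the-envelope sketch in the spirit of such studies: hardware energy use scaled by data-center overhead (PUE) and the grid’s carbon intensity. Every number in it is an assumed placeholder, not a figure from the paper.

```python
# Back-of-the-envelope CO2 estimate for a training run:
# device energy x data-center overhead (PUE) x grid carbon intensity.
# All values below are assumed placeholders, not measured figures.

def training_emissions_lbs(gpu_count, avg_power_w, hours, pue, lbs_co2_per_kwh):
    """Estimate the CO2 emitted (in pounds) by one training job."""
    device_kwh = gpu_count * avg_power_w * hours / 1000.0  # energy at the GPUs
    total_kwh = device_kwh * pue                           # add cooling and overhead
    return total_kwh * lbs_co2_per_kwh

if __name__ == "__main__":
    estimate = training_emissions_lbs(
        gpu_count=8,           # assumed cluster size
        avg_power_w=300,       # assumed average draw per GPU
        hours=24 * 7,          # assumed one week of training
        pue=1.6,               # assumed data-center overhead factor
        lbs_co2_per_kwh=0.9,   # assumed grid carbon intensity
    )
    print(f"Estimated emissions: {estimate:,.0f} lbs CO2")
```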

Large data centers are needed to keep AI models running, and they have to run at full capacity 24 hours a day. They require not only large amounts of electricity but also large amounts of fresh water for cooling. One article suggests that by 2027, the data processing needed for generative AI will require more than four times as much water as Denmark currently consumes annually.
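
The water side can be sketched the same way: on-site cooling water per unit of IT energy (a metric usually called water usage effectiveness, or WUE) plus the water consumed generating the electricity. The coefficients below are assumptions for illustration, not the article’s figures.

```python
# Rough water-footprint sketch for a data-center workload:
# on-site cooling water per kWh of IT energy (WUE) plus the water
# consumed by the power plants supplying that electricity.
# All coefficients are illustrative assumptions.

def water_liters(it_energy_kwh, onsite_wue_l_per_kwh, grid_water_l_per_kwh, pue):
    onsite = it_energy_kwh * onsite_wue_l_per_kwh          # evaporated in cooling towers
    offsite = it_energy_kwh * pue * grid_water_l_per_kwh   # water used in generation
    return onsite + offsite

if __name__ == "__main__":
    liters = water_liters(
        it_energy_kwh=1000.0,        # assumed IT load for the period
        onsite_wue_l_per_kwh=1.8,    # assumed cooling water per kWh
        grid_water_l_per_kwh=3.0,    # assumed water intensity of the grid
        pue=1.2,                     # assumed facility overhead
    )
    print(f"Estimated water consumption: {liters:,.0f} liters")
```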

Generative AI such as ChatGPT multiplies those energy requirements many times over. According to Dr. Kate Crawford, “One assessment suggests that [it] is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.”

Add to those effects the increased demand for diminishing supplies of various chemicals and minerals, along with the problem of disposing of electronic devices containing hazardous materials as they are replaced to keep up with rapidly changing technology.

What Should We Do?

Since AI is clearly here to stay and will keep growing, it is vital that we address its negative environmental impacts while taking advantage of its ability to help mitigate environmental damage.

Minimizing adverse environmental impacts must be included in the design objectives for both AI hardware and software right from the beginning — it cannot be an afterthought.

An article in the journal Nature Machine Intelligence offers several useful suggestions. First and foremost, in my opinion, is to consider whether your problem really needs an AI solution or can be solved with conventional means.

According to Deepika Sandeep, who heads the AI and ML program at Bharat Light & Power (BLP), a Bengaluru, India-based clean-energy-generation company, “Using renewable energy grids for training neural networks is the single biggest change that can be made. It can make emissions vary by a factor of 40, between a fully renewable grid and a fully coal grid.”
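
The point is easy to see with round numbers: emissions scale directly with the grid’s carbon intensity, so the same training job can differ enormously depending on where it runs. The intensities below are assumed values for a largely renewable grid versus a coal-heavy one, picked only to show a spread of roughly the size Sandeep describes.

```python
# The same training job's emissions scale with the grid's carbon intensity.
# The energy figure and grid intensities are assumed round numbers.

TRAINING_ENERGY_KWH = 5000.0  # assumed energy for one training run

GRID_G_CO2_PER_KWH = {
    "largely renewable grid": 25.0,   # assumed
    "coal-heavy grid": 1000.0,        # assumed
}

for grid, intensity in GRID_G_CO2_PER_KWH.items():
    tonnes = TRAINING_ENERGY_KWH * intensity / 1_000_000.0
    print(f"{grid}: {tonnes:.2f} tonnes CO2")

ratio = GRID_G_CO2_PER_KWH["coal-heavy grid"] / GRID_G_CO2_PER_KWH["largely renewable grid"]
print(f"Spread between the two grids: {ratio:.0f}x")
```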

Researchers at the University of Massachusetts Amherst, who have been studying the problem, encourage designers “to prioritize computationally efficient hardware and algorithms, to report training time and sensitivity to hyperparameters in published performance results, and to perform a cost-benefit analysis of NLP models for comparison.”
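
A minimal sketch of that kind of reporting might look like the following: training time and an estimated energy figure published next to accuracy, so a simple cost-benefit comparison between candidate models is possible. The model entries are hypothetical.

```python
# Report training time and estimated energy alongside accuracy so that
# candidate models can be compared on cost as well as benefit.
# The entries below are hypothetical examples.

candidates = [
    # (name, accuracy, training hours, estimated kWh)
    ("baseline",     0.88, 12,  40.0),
    ("larger model", 0.90, 96, 900.0),
]

for name, acc, hours, kwh in candidates:
    print(f"{name}: accuracy={acc:.2f}, training time={hours} h, energy~{kwh:.0f} kWh")

(_, base_acc, _, base_kwh), (_, big_acc, _, big_kwh) = candidates
gain_points = (big_acc - base_acc) * 100
extra_kwh = big_kwh - base_kwh
print(f"Cost-benefit: +{gain_points:.1f} accuracy points for ~{extra_kwh:.0f} extra kWh")
```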

Engineers and computer scientists have a great ability to develop innovative solutions to difficult problems — what is needed is for them to include environmental concerns in whatever AI project they are undertaking.