This article is part of a series of pieces anticipating the themes that will be discussed during the second edition of domusforum - the future of cities, in Milan on October 10, 2019.
For years the scientific community has been telling us, almost in unison, that artificial intelligence will enable great energy savings and thus generate a more eco-sustainable society: autonomous vehicles, intelligent buildings and self-adapting utilities will lead to smart cities in which sustainability and efficiency are the order of the day and all waste is banished.
But beyond the precautionary exercise of doubt to which every major change should, as a matter of principle, be subjected, today the scientific community expresses some reservations, not so much about the final outcome of the revolution in progress as about its costs in the short, even very short, term. In recent months major scientific journals have published the opinions of researchers who warn about the huge energy consumption required to implement the new technologies, and especially about the increase that can easily be predicted for the coming years.
The online edition of the MIT Technology Review, for example, hosted on July 29th a statement by Gary Dickerson, CEO of Applied Materials and one of the most influential personalities in Silicon Valley. According to Dickerson, A.I. could account for one tenth of the world's electricity consumption as early as 2025. In essence, machine learning could represent – in a dramatic paradox – a serious threat to the ecosystem.
We asked Lorenzo Natale, senior researcher at the Italian Institute of Technology (IIT) and an expert in applications of artificial intelligence to robotics, for his opinion.
A recent study estimated that the work needed to develop a large neural network causes an emission of greenhouse gases comparable to the life cycle of five average American cars
"The concerns are undoubtedly well-founded," says Natale. "Just think that a recent study by the University of Massachusetts estimated that the work needed to develop a large neural network causes greenhouse gas emissions comparable to the life cycle of five average American cars, manufacturing included. But the fact that we are talking about it openly is a good sign: it means we are aware of the problem."
How do we get out of this vicious circle, then, especially in view of the acceleration expected in the near future? The expert again: "Today, A.I. is a field of intensive research, in which science aims at the widest possible exploration without thinking too much about the cost-benefit ratio. In the future, when practical application takes precedence over purely experimental research, this framework should change in favor of a greater parsimony of resources. I must, however, use the conditional in such a fluid field."
In general, it is precisely the experimental phase that involves the greatest consumption. The life of a neural network can be divided into three phases: prototyping, training and inference. "The third is the actual use of the network and requires little energy," says the IIT researcher, "while the first two, and especially the first, are extremely expensive, because they require immense volumes of calculation to design the network and adapt it to the required tasks."
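The asymmetry between the three phases can be made concrete with a deliberately simple back-of-envelope sketch. All the figures below (network size, number of training steps, number of prototypes tried) are illustrative assumptions of ours, not measurements from the study cited in the article:

```python
# Back-of-envelope comparison of compute across the three phases of a
# neural network's life. Every number here is an assumption chosen
# only to show the orders of magnitude involved.

PARAMS = 100_000_000          # hypothetical network with 100 million parameters

# One forward pass costs roughly 2 FLOPs per parameter per input;
# a training step (forward + backward pass) costs roughly 3x that.
flops_forward = 2 * PARAMS
flops_train_step = 3 * flops_forward

TRAIN_STEPS = 1_000_000       # assumed optimisation steps to train once
PROTOTYPES = 50               # assumed candidate designs tried before the final one

inference = flops_forward                  # one answer from the finished network
training = flops_train_step * TRAIN_STEPS  # training the final network once
prototyping = training * PROTOTYPES        # repeating training across many designs

print(f"inference:   {inference:.1e} FLOPs")
print(f"training:    {training:.1e} FLOPs")
print(f"prototyping: {prototyping:.1e} FLOPs")
```

Even with these toy numbers, a single training run costs millions of times more than a single use of the network, and the prototyping phase multiplies that again by the number of designs explored, which is why the experimental phase dominates consumption.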
In the future, when practical application takes precedence over purely experimental research, this framework should change in favor of greater parsimony of resources
With a view to greater system efficiency, there are two fields in which to operate: on the one hand, new and better-performing materials are needed; on the other, better algorithms must be developed, able to give the same answers with a smaller volume of calculation.
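The idea behind the second lever, the same answer with less computation, can be illustrated with a toy example of our own that has nothing to do with neural networks: summing the first n integers either one by one, or with Gauss's closed-form formula.

```python
# Two algorithms, identical answers, very different amounts of work.

def sum_by_loop(n):
    """Add the integers 1..n one at a time: n additions."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_by_formula(n):
    """Gauss's closed form n(n+1)/2: the same answer in constant time."""
    return n * (n + 1) // 2

n = 1_000_000
assert sum_by_loop(n) == sum_by_formula(n)  # same result, a millionth of the operations
```

A better neural-network algorithm plays the same role on a vastly larger scale: identical (or nearly identical) answers obtained with far fewer operations, and therefore far less energy.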
However, it is not easy to say who the main agents of change will be. The big tech giants are often associated with unethical behaviour, but they too ultimately pay their bills and therefore have a strong interest in rationalising consumption (in cloud services, for example), as do the automotive companies engaged in developing self-driving vehicles. Universities and public research centres, on the contrary, do pure research and are more interested in achieving the result itself than in its practical efficiency. But the opposite may also be true: academic research often suffers from a lack of funds, and these limitations may lead to a new line of work that maximises performance while containing energy consumption, and therefore environmental impact.
"Although it is almost impossible to make predictions at the moment," concludes Lorenzo Natale, "it is essential not to repeat the mistake of the last 150 years. If we once again underestimate the impact our technologies have on the ecosystem, this time we may have no way to remedy it."