AI and the Environment: A Global North/South Divide

 

AI can bring benefits globally, but it is energy-intensive, and combined with existing inequalities in infrastructure it could reinforce power imbalances. By Antonio Ballesteros-Figueroa.

For most of us, checking the weather app on our phones is part of our daily routine. We want to reduce the uncertainty in our lives as much as possible. Similarly, farmers around the world depend on long-term meteorological forecasts to manage uncertainty in highly variable systems. AI makes these forecasts possible by combining algorithmic models with vast amounts of data processed on supercomputers.

Forecasting is not limited to the weather. During the Covid-19 pandemic, for instance, groups of researchers1,2 have been using machine-learning algorithms to forecast whether a person might develop severe illness, based on their symptoms on the first day.
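To give a flavour of what this kind of forecasting involves, the sketch below trains a simple severity classifier on made-up, day-one symptom data. It is an illustration only, not the method used in the studies cited above: the feature names, the data and the choice of logistic regression are all assumptions made for the sake of the example.

```python
# Illustrative sketch only: a toy symptom-based severity classifier.
# The features, data and model choice are hypothetical, not those of the cited studies.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical day-one features: fever, cough, fatigue, loss of smell, age (all scaled 0-1).
X = rng.random((500, 5))
# Hypothetical outcome: 1 = went on to develop severe illness, 0 = did not.
weights = np.array([0.8, 0.3, 0.5, 0.9, 1.2])
y = (X @ weights + rng.normal(0, 0.4, 500) > 1.9).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Predicted probability of severe illness for the held-out patients.
probs = model.predict_proba(X_test)[:, 1]
print(f"Toy model AUC: {roc_auc_score(y_test, probs):.2f}")
```

Even a toy example like this makes the wider point visible: the data collection, the compute and the modelling choices all sit with whoever runs the pipeline.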

All this forecasting activity comes with a cost that is both economic and environmental. It requires an energy-intensive infrastructure of supercomputers and server farms to process the huge amounts of data involved. The need for this infrastructure not only has a physical impact on the planet – it can also have profound social and political consequences.

Climate-related conflict

At Uppsala University's Department of Peace and Conflict Research (PCR) in Sweden, researchers working on the Violence Early Warning System (ViEWS) aim to forecast climate-related conflicts. In particular, the goal is to predict how agricultural changes linked to droughts could affect human displacement within the next 100 years. AI forecasting tools like these are intended to influence decision-making, not only by the affected communities but by every stakeholder.

Yet while AI tools could help to improve policy decision-making, they can also reinforce existing unequal power relations between the Global North and the Global South. Interviews I conducted with ViEWS members while a Visiting Researcher at PCR, for example, reveal that these tools represent a new way in which rich countries can impose policies without developing nations having the capacity to challenge them. Existing infrastructural inequalities, together with a lack of participation from communities in the Global South, make it almost impossible to replicate these tools outside rich nations.

These infrastructural inequalities might also make it unrealistic to recreate AI projects from scratch. As one ViEWS member told me: “The supercomputer capacity that we are using every month is more than most African countries possess. Most African countries don't possess any access to a supercomputer technique. I know of five European countries, at the country-level, that would not be able to replicate ViEWS because they don't have the infrastructure… That in itself is a problem. You need to run a project like this in a country like this [Sweden] because otherwise there is no money to do it.”

While ViEWS is produced in Sweden, the drought forecasts are focused on East Asia and Africa. What, then, does this say about the technical capacity of the affected farmers? The issue isn't only one of technical restrictions. It is also about understanding that "science as development, plan, experiment, pedagogy determines the life chances of a variety of people”.3

Challenging and developing

The people whose lives might be affected by these forecasts should have some agency in how the systems are developed. The reality, however, is that AI projects often ignore the fact that communities, as well as scientists, need to be able to challenge both the forecasting methods and the results.

Another issue that makes it difficult for local communities to participate is the opacity of the meta-processes behind the code. All programming languages operate in a balance between defined rules and individual style. A standard approach to coding would make for a much more transparent and explainable process, but the idea of coding ‘hygiene’ tends to be depicted as something that goes against the nature of programming. When I asked another ViEWS member about standardisation in this area, the idea was dismissed: “It’s like asking a poet or a writer to standardise their writing – it will never happen.”

What matters, the argument goes, is that individuals are free to solve everyday programming problems in their own way. Yet if individualism is preferred over standardisation, the need to record the rationale behind every decision only grows. In practice, mundane, everyday decisions are rarely documented, which makes understanding how algorithms are produced, and the thinking behind them, even more difficult.
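As a purely hypothetical fragment (my own example, not code from ViEWS or any real project), the difference between an unrecorded decision and a recorded one can be as small as a comment:

```python
# Hypothetical example; not drawn from ViEWS or any real codebase.

# Version 1 - unrecorded decision: a later reader cannot tell why 30 was chosen.
drought_threshold_days = 30

# Version 2 - recorded decision: the rationale travels with the code.
# A 30-day rain-free spell is assumed as the drought cut-off because, in this
# made-up scenario, shorter spells produced too many false positives.
drought_threshold_days = 30
```

Neither convention is enforced by the language itself, which is precisely why such mundane choices so often go undocumented.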

It’s clear that tools built with machine learning or artificial intelligence could greatly expand our capacity to deal with big data. Yet the combination of existing infrastructural inequalities and the individualistic nature of programming instead reinforces colonial and patronising attitudes between the Global North and South. While the infrastructural issues might not be solvable in the short term, greater participation of local communities in how AI is produced could help to diminish these attitudes.


1. Knight, S. R. et al. 2020. ‘Risk stratification of patients admitted to hospital with covid-19 using the ISARIC WHO Clinical Characterisation Protocol: development and validation of the 4C Mortality Score’. BMJ 370: m3339. https://doi.org/10.1136/bmj.m3339. https://www.ncbi.nlm.nih.gov/pubmed/32907855.

2. Menni, C. et al. 2020. ‘Real-time tracking of self-reported symptoms to predict potential COVID-19’. Nat Med 26 (7): 1037-1040. https://doi.org/10.1038/s41591-020-0916-2. https://www.ncbi.nlm.nih.gov/pubmed/32393804.

3. Visvanathan, S. 2005. ‘Knowledge, justice and democracy’. In Science and Citizens, edited by Melissa Leach, Ian Scoones and Brian Wynne (Claiming Citizenship: Rights, Participation and Accountability series). London: Zed Books.


 