
Written by Dr David Storrs-Fox and Dr Emma Curran, with thanks to Dr Jonas Bozenhard.
Last month, the Institute for Ethics in AI co-sponsored a conference at Hamburg University of Technology (TUHH) on climate, responsibility and technology. Hosted by TUHH’s Institute for Ethics in Technology, the conference brought philosophers into conversation with engineers and others to explore the ethical dimensions of climate change. Oxford supplied three of the four speakers: Alice Evatt from the Environmental Change Institute, and Emma Curran and David Storrs-Fox from the Institute for Ethics in AI. They were joined by Matthew Braham from the University of Hamburg.
Evatt opened the conference by arguing that climate change constitutes an emergency, and by exploring the ethical implications of that claim. She argued that the longer we delay adequate action, the more extreme the required response will be. Recognizing that climate change is an emergency does not necessarily require adopting the most extreme solutions (such as solar geoengineering), but it does require the urgent implementation of a portfolio of approaches, including emissions reduction, carbon removal, and CO2 storage. Evatt also discussed the possibility that AI might be deployed to address climate change.
Braham addressed the moral responsibility of agents in situations of strategic interaction, taking as his central example the way one country’s climate policy responds to the expected climate policies of another. He argued that such situations can give rise to responsibility gaps, in which no country is responsible for climate harms; paradoxically, however, uncertainty about the path other countries will take can also ground a country’s moral responsibility for those harms.
Curran discussed situations in which it seems we are required to save a smaller number of people rather than a larger number, with implications for how we ought to prioritise among the different generations that might be affected by climate harms. She considered possible principles to make sense of such cases, including the thought that there is something especially morally weighty about “dooming” a person, and the thought that the presence of others (such as non-profit organisations) makes a difference to what we ought to do. The talk suggested some lessons for the popular longtermism movement in moral philosophy.
Storrs-Fox addressed the concern that although severe climate harms are caused collectively by human agency, the moral responsibility that can be attributed for those harms falls severely short. Such responsibility shortfalls seem to arise in many situations of collective action, and AI may give rise to still more of them. However, it is unclear why such shortfalls should matter morally, and, if they do matter, what their moral implications are. Storrs-Fox proposed a framework for understanding and assessing the importance of responsibility shortfalls, drawing on resources from moral psychology.
The discussions that followed each talk centred on several major themes: first, the diverse and manifold ethical foundations of concern about climate change; second, the problem of appropriately allocating responsibility for climate harm; and third, the potential roles of technological and policy solutions in climate action.
The event was part of the UNU (United Nations University) Hub on Engineering to Face Climate Change at TUHH, and received generous funding from the Deloitte-Stiftung.
- TUHH website: https://www.tuhh.de/ethics/
- UNU Hub: https://www.tuhh.de/unuhub/