Emerging risks to humanity’s future
Modern science is well-acquainted with the idea of natural risks, such as asteroid impacts or extreme volcanic events, that might threaten our species as a whole. It is also a familiar idea that we ourselves may threaten our own existence, as a consequence of our technology and science. Such home-grown “existential risk” – the threat of global nuclear war, and of possible extreme effects of anthropogenic climate change – has been with us for several decades.
However, it is a comparatively new idea that developing technologies might lead – perhaps accidentally, and perhaps very rapidly, once a certain point is reached – to direct, extinction-level threats to our species. Such concerns have been expressed about artificial intelligence (AI), biotechnology, and nanotechnology, for example.
Technology and uncertainty
The common factor in these concerns is that the new capabilities of such technologies might provide direct and relatively short-term control over circumstances essential to our survival, and either place that control in dangerously few human hands, or take it out of our sphere of influence altogether, so that we cannot protect ourselves. The grounds for such concerns are presently difficult to assess. Relatively little work has been done on these problems, and experts in the fields in question often disagree. These uncertainties are themselves a ground for concern, given how much is at stake.
Investigation and mitigation
The Centre for the Study of Existential Risk is premised on the view that the task of investigating and mitigating such home-grown existential risks is a pressing and enduring responsibility for the scientific community; a task whose urgency and importance may be expected only to increase, as technology continues to develop. Yet there is at present little coherent sense of what this task amounts to – little sense of the necessary shape and components of a practical and theoretical science of existential risk. Our aim is to construct and conceptualise this new science, and to begin developing a protocol for the investigation and mitigation of technologically driven existential risk. For more information about our research areas and strategies, please see Research.