New Public Lecture Video – Dr Heather Roff

6 October 2017

Dr Heather Roff – Nukes of Hazard: Mapping the Relative Risks Emerging Technologies Pose to Nuclear Weapons Systems

Dr Roff, Senior Research Fellow in the Department of Politics and International Relations at the University of Oxford, discusses the risks the emerging technologies of artificial intelligence and automation pose to nuclear modernisation.

Shahar Avin on NonProphets podcast

6 October 2017

In this episode (recorded 9/27/17), the superforecasters of the NonProphets podcast interview CSER’s Dr. Shahar Avin.

They discuss the prospects for the development of artificial general intelligence; why general intelligence might be harder to control than narrow intelligence; how we can forecast the development of new, unprecedented technologies; what the greatest threats to human survival are; the “value-alignment problem” and why developing AI might be dangerous; what form AI is likely to take; recursive self-improvement and “the singularity”; whether we can regulate or limit the development of AI; the prospect of an AI arms race; how AI could be used to undermine political security; OpenAI and the prospects for protective AI; tackling AI safety and control problems; why it matters what data is used to train AI; when we will have self-driving cars; the potential benefits of AI; and why scientific research should be funded by lottery.

Read more and listen here.

Global Catastrophic Risks 2017

29 September 2017

Global Catastrophic Risks 2017 is an annual analysis of the greatest threats to humanity produced by the Global Challenges Foundation. It is based on the latest scientific research and features contributions from leading experts at think tanks, university departments and other institutions worldwide. As well as exploring the risks themselves, it summarizes the current status of global efforts to manage them.

Our co-founder Lord Martin Rees provided the introduction, What is a Global Catastrophic Risk?

Should We Care About The Worst-Case Scenario When It Comes To Climate Change?

28 September 2017


Dr Simon Beard has written a piece for the Huffington Post answering the question “Should We Care About the Worst-Case Scenario When it Comes to Climate Change?” The piece draws on themes from an ESRC-funded workshop on Risk, Uncertainty and Catastrophe Scenarios convened by Simon and Dr Kai Spiekermann. The report from the workshop can be found here.

Back in February 2017 CSER co-hosted Prof Ramanathan for a public lecture on Climate Change, Morphing into an Existential Threat. In August 2017 he published the paper “Well below 2 °C: Mitigation strategies for avoiding dangerous to catastrophic climate changes”, which investigates climate change scenarios with effects “beyond catastrophic”.

Summer visitors

22 September 2017

Over the summer CSER has welcomed a number of interns and visitors who are pursuing independent projects. The following videos highlight the work of three visitors who were supervised by CSER Research Associate Simon Beard.

Natalie Jones is a PhD student in the Law Department at the University of Cambridge. In 2017 she produced, in collaboration with the Future of Sentience Society which CSER supports, a research paper on political representation for future generations around the world. This included a specific proposal to give future generations a voice in the UK by establishing an All-Party Parliamentary Group within the UK Parliament to represent their interests. The group will be formally established later this year, and CSER plans to play an ongoing role in supporting its activities in order to increase our engagement with UK policy makers.

Rachel Polfer is an undergraduate at Mount Holyoke College, where she studies both Philosophy and Biology and has previous experience working with GM mosquitoes to counter the Zika virus. She has been working on an analysis of different approaches to the evaluation of biological risks, including an assessment of the value of taking a precautionary approach to the evaluation of genetically modified organisms, a comparative assessment of the existential risks posed by synthetic biology and nanotechnology, and an introductory report on emerging biological technologies for a general audience, covering synthetic biology, CRISPR-Cas9 and gene drives.

 

Andrew Ware is a senior undergraduate at the University of New Hampshire, where he studies Philosophy and Economics. Andrew worked on evaluating the distributional implications of AI and machine learning, with a special focus on their ability to solve problems of resource scarcity. His research involved establishing an extensive network of potential stakeholders across the fields of philosophy, economics, AI, robotics, climate science, development and law. It has already helped to inform CSER’s response to the House of Lords Select Committee on Artificial Intelligence’s public inquiry on the ethical implications of AI.

When The World Didn’t End

15 September 2017

Simon Beard has a feature in the upcoming October 2017 edition of the BBC History Magazine on historical cases in which humanity narrowly avoided a global catastrophe, most of them near misses involving nuclear war.

He also wrote a blog post for our parent organisation, CRASSH – Less Hollywood, More Car Crash: Putting the USA vs. DPRK nuclear stand-off into historical perspective.

“The more I studied these incidents, the more I concluded that we respond to them in the wrong ways. There is a tendency to see each and every case as a terrible mistake, a one-off freak accident that must never be allowed to happen again. However, when you see so many different incidents lined up side by side you realise that this just isn’t true. Nuclear near misses are not unconnected moments of madness, they are a systemic feature of our ability to do such massive damage in such a short period of time.”

“So far 4 different attempts have been made to quantify just how low this probability might be. In 2008 Martin Hellman estimated that the annual probability of a ‘Cuban Missile Type Crisis’ producing a nuclear war was 0.02% to 0.5%. Then, in 2013, Anthony Barrett and colleagues at the Global Catastrophic Risk Institute estimated the annual probability of a nuclear war between the USA and Russia as being somewhere between 0.001% and 7%. Carl Lundgren has estimated that over the past 66 years we have faced an annual probability of nuclear war that was greater than 1% per year. Finally, in 2015, Dennis and Armstrong surveyed expert opinion to estimate that there is an approximately 0.05% chance per year of a nuclear war that had the potential to cause human extinction. There is a lot of variation and uncertainty here, but given the different approaches and methodologies being used, we can say with some confidence that the probability of a nuclear war starting is likely greater than 0.01% per year, and that it could be considerably higher.

What does this mean? Well let’s assume that if there was an international nuclear war, your chances of being killed would be around 10%. If the annual probability of such a war is 0.01%, this gives you a 1 in 10,000 risk of dying in a nuclear war each year, or a 1 in 125 risk of being killed this way over the course of an 80-year lifespan. That is about as high as your risk of dying in a motoring accident (and incidentally means that, even in a country like the USA, people are more likely to be killed by a nuclear warhead than a firearm).

So, the next time you read about rising tensions between two nuclear armed states, remember that this may be an international car crash in more ways than one and you should probably try to worry about things escalating about as much as you would if you saw somebody driving dangerously. Sadly, such incidents aren’t one off tragedies caused by the unique personalities of those involved, they are a regular fact of life and deserve sustained effort to prevent and avoid. As long as states keep hold of their nuclear weapons, this is how things are likely to remain.”
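
The step in the quoted passage from an annual risk to a lifetime risk can be checked with a few lines of arithmetic. The sketch below is purely illustrative: it takes the 1-in-10,000 annual figure and the 80-year lifespan as given in the quote, and assumes the risk is constant and independent from year to year.

    # Illustrative check of the lifetime-risk arithmetic in the quoted passage.
    # Assumes (per the quote) a constant 1-in-10,000 annual risk of dying in a
    # nuclear war, independent from year to year, over an 80-year lifespan.
    annual_risk = 1 / 10_000
    years = 80

    # Probability of dying at least once over the lifespan.
    lifetime_risk = 1 - (1 - annual_risk) ** years

    print(f"Lifetime risk: {lifetime_risk:.2%}")        # ~0.80%
    print(f"Roughly 1 in {round(1 / lifetime_risk)}")   # ~1 in 125

Because the annual risk is small, this compound calculation is almost identical to the simple multiplication 80 × 1/10,000 = 1/125, which is the approximation the quoted passage relies on.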

CSER at Effective Altruism Global 2017

31 August 2017

Our Academic Project Manager Haydn Belfield led a workshop at Effective Altruism Global 2017 in San Francisco. The workshop was an interactive table-top scenario exercise exploring the international community’s potential response to a viral outbreak. While in the Bay Area he also met with collaborators at the University of California, Berkeley and at Stanford University.

The conference was attended by over 600 people, and included speakers such as Holden Karnofsky, the Executive Director of the Open Philanthropy Project, which last year recommended over $100 million in grants, and Tom Kalil, Senior Advisor at the Eric and Wendy Schmidt Group, previously at the White House Office of Science and Technology Policy. Effective Altruism Global is the annual conference of the effective altruism community. Effective altruism is about using evidence and reason to figure out how to benefit others as much as possible, and taking action on that basis. For more information about effective altruism, visit www.effectivealtruism.org.

Participants at the workshop.

Academic Project Manager Haydn Belfield.

CSER team away-day

11 July 2017

Research Associate Simon Beard presents to the team.

The CSER team had a team ‘away-day’ on Monday 10th July to reflect on the previous year-and-a-half of operation and plan for the next year-and-a-half. We discussed what we do well and where we can improve; plans for several collaborative, interdisciplinary papers; how to prioritise between the opportunities we have as a Centre; and what the next ‘phase’ of CSER should look like.

 

Research Affiliate Martina Kunz makes a point, alongside Catherine Rhodes, Lalitha Sundaram, Julius Weitzdörfer, Yang Liu and Jens Steffensen.

Shahar Avin, Ellen Quigley, Haydn Belfield, Jens Steffensen and Tatsuya Amano listen to Academic Director Huw Price.

 

The CSER team includes (L-R) Gorm Shackelford, Tatsuya Amano, Adrian Currie, Catherine Rhodes, Yang Liu, Simon Beard, Nikita Chiu, Martin Rees, Lalitha Sundaram, Haydn Belfield, Shahar Avin, Martina Kunz, Julius Weitzdörfer and Seán Ó hÉigeartaigh (to say nothing of the dog).