Elon Musk warns of existential risk from AI

This past week, Elon Musk, CEO of Tesla and SpaceX, warned an audience of MIT students of the risks from artificial intelligence.

“I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, it’s probably that. So we need to be very careful with artificial intelligence.

I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish. With artificial intelligence we’re summoning the demon. You know those stories where there’s the guy with the pentagram, and the holy water, and he’s like — Yeah, he’s sure he can control the demon? Doesn’t work out.”

Musk has previously made headlines for explaining that his investments in Vicarious were intended to keep an eye on artificial intelligence risks, and for tweeting about Nick Bostrom’s book Superintelligence. He joined the advisory board of CSER several months ago.

Musk’s recent remarks on artificial intelligence and the remainder of his talk can be viewed here.

Margaret Boden to be interviewed on AI this week

This Tuesday, Professor Margaret Boden, an advisor to CSER, will be interviewed on The Life Scientific on the topic of artificial intelligence. At 9am on BBC Radio 4, she will discuss the potential of artificial intelligence, as well as the insight a computational approach can offer into understanding the mind. If you can’t catch the segment, it will be made available online shortly after broadcast.

CSER and German Government organise workshop on extreme technological risks

The Centre for the Study of Existential Risk is delighted to partner with the German government in organising a high-level workshop on existential and extreme technological risks, to take place on Friday September 19th.  The meeting will bring together leading German and UK research networks to focus on emerging technological threats, and will be hosted by the German Federal Foreign Office, together with the Ministry of Science and Education and the Ministry of Defence.

Ten of CSER’s leading academics and advisors will take part and present: Lord Martin Rees, Professor Huw Price, Professor William Sutherland, Professor Susan Owens, Mr Jaan Tallinn, Professor Nick Bostrom, Professor Stuart Russell, Professor Tim Lewens, Dr Anders Sandberg and Dr Sean O hEigeartaigh. They will be joined by leading experts from a range of Germany’s research networks, including the Max Planck Society, the Robert Koch Institute, the Center for Artificial Intelligence, the Fraunhofer Institute and the Helmholtz Association, as well as German universities. Also attending will be members of a range of German governmental departments, the UK’s Foreign and Commonwealth Office, and senior representatives of the Volkswagen Foundation.

Topics to be discussed will include approaches for analysing high-impact, low-probability risks from technology, horizon-scanning and foresight methods, policy challenges, and areas of potential synergy or collaboration between research networks. Specific sciences and technologies to be discussed include artificial intelligence, emerging capabilities in biotechnology, and pathogen research.

CSER is very grateful for the support of the German government, and the Federal Foreign Office in particular, in organising and funding the event and the travel of German participants, and for helping to bring this level of expertise to bear on questions of global importance. CSER is also extremely grateful for the financial support of cryptographer and software engineer Paul Crowley, who funded flights and accommodation for CSER academics, and without whose support the workshop could not have taken place.

CSER co-founders Price and Tallinn at the Festival of Dangerous Ideas

Today and tomorrow CSER co-founders Huw Price and Jaan Tallinn will be presenting a series of dangerous ideas – that our continued survival and flourishing as a species is in our hands, that we may be in the most dangerous century of Earth’s history, and that the responsibility we owe to future generations is far greater than we may realise. They will speak to an audience of thousands at the Sydney Opera House in Australia.

For those in the wrong hemisphere, talks and discussions will be available online; CSER will post links when they become available.
CSER welcomes Professor Chris Lowe to advisory board

CSER is happy to welcome Professor Chris Lowe to our advisory board.  As Professor of Biotechnology and Director of the Institute of Biotechnology at the University of Cambridge, he is perfectly suited to provide guidance on emerging risks from advanced biotechnologies.

Two Church papers on gene drive technologies

This week, CSER advisor George Church has co-authored two papers on gene drive technologies, renewing interest in their risks and benefits. First proposed ten years ago, gene drive technology causes a gene to be preferentially inherited so that it can spread through a population.

Although gene drives have not yet been implemented, they are coming closer to reality. In their technical report, Esvelt, Smidler, Catteruccia and Church report that the technology is being accelerated by CRISPR-Cas9, a genome-editing technique. Church reports that Esvelt is already conducting CRISPR-based gene drive experiments in yeast, nematodes and mosquitoes. They state that gene drives could be used to assist in the eradication of insect-borne diseases – for example, by reducing mosquito populations to prevent them from transmitting malaria.

However, gene drives might also carry substantial risk. In their editorial, Church and nine other scientists report that gene drives may pose substantial risks to wild organisms, crops and livestock. They argue that although US security policies express broad concerns, they are narrow in the scope of their oversight, focusing on weapons, pathogens and toxins. They argue instead for defining risk in terms of the ability of biotechnologies to cause harm to humans and other species of interest.

They conclude:

For emerging technologies that affect the global commons, concepts and applications should be published in advance of construction, testing, and release. This lead time enables public discussion of environmental and security concerns, research into areas of uncertainty, and development and testing of safety features. It allows adaptation of regulations and conventions in light of emerging information on benefits, risks, and policy gaps. Most important, in the case of gene drives, lead time will allow for broadly inclusive and well informed public discussion to determine when and how gene drives should be used.

Professor Tim Palmer added to CSER’s advisory board

Professor Tim Palmer has been newly added to CSER’s advisory board. Palmer is Royal Society Anniversary Research Professor at the University of Oxford, a Senior Scientist at the European Centre for Medium-Range Weather Forecasts, and was President of the Royal Meteorological Society from 2011 to 2012. This year, he was awarded the Dirac Medal and Prize by the Institute of Physics for his work on climate models. He has performed extensive research on forecast uncertainty, the propagation of errors, and the relative merits of ensemble and deterministic forecasting.

Interestingly, the first Dirac Medal from the Institute of Physics was awarded to another CSER advisor, Professor Stephen Hawking, in 1987.

Prof David Spiegelhalter knighted

Professor David Spiegelhalter was knighted in the Queen’s Birthday Honours last weekend. Known to the public as Professor Risk, Spiegelhalter was honoured for his services to statistics.

Sir David has been the Winton Professor of the Public Understanding of Risk at the University of Cambridge since October 2007, and is an advisor to the Centre for the Study of Existential Risk. He was elected a Fellow of the Royal Society of London in 2005 and awarded an OBE in 2006 for services to medical statistics.

“Statistics is not very sexy to be honest, so I’m very honoured and gratified that somebody thinks statistics is worth working on, so I’m very pleased for myself and the field”, he said.

There were also knighthoods for David Greenaway and David Eastwood, neuroscientist Colin Blakemore, psychologist Cary Cooper, historian Thomas Devine and theoretical physicist Thomas Kibble.