Stuart Russell argues for a new approach to AI risk

08 December 2014

Stuart Russell, Professor of Computer Science at the University of California, Berkeley, author (with Peter Norvig) of Artificial Intelligence: A Modern Approach, and CSER External Advisor, has warned of the risks posed by Artificial Intelligence and encouraged researchers to rethink the goals of their research.

Writing on edge.org, described by The Guardian as ‘an internet forum for the world’s most brilliant minds’, Russell notes that while it has not been proved that AI will be the end of the world, “there is no need for a proof, just a convincing argument pointing to a more-than-infinitesimal possibility.” He observes that many unconvincing arguments have been refuted, but argues that the more substantial arguments put forward by Omohundro, Bostrom and others remain largely unchallenged.

Up to now, improving decision quality has been the mainstream goal of AI research, an end towards which significant progress has been made in recent years. In Russell’s view, AI research is accelerating rapidly, and senior AI researchers express considerably more optimism about the field’s prospects than was the case even a few years ago; as a result, he argues, we should be correspondingly more concerned about the field’s risks.

He dismisses fears that those raising the prospect of AI risk are calling for regulation of basic research, an approach he considers misguided and misdirected given the potential benefits of AI for humanity. The right response, he writes, is to change the goals of the field itself: from building pure intelligence to building intelligence that is provably aligned with human values. Just as containment is seen as an intrinsic part of nuclear fusion research, such an approach would reliably and practically limit the risk of future catastrophe.

Read the entire discussion, including contributions from, among others, George Church, Professor of Genetics at Harvard and CSER advisor, and Peter Diamandis, Chairman of the X Prize Foundation, here.