The Future of Biotech Enterprise: Exponential Opportunities and Existential Risks

On Wednesday 2nd December, CSER (Managing Extreme Technological Risks), Cambridge University Entrepreneurs and the Masters in Bioscience Enterprise are partnering to host “The Future of Biotech Enterprise: Exponential Opportunities and Existential Risks”.

Speakers include CSER adviser Prof Chris Lowe, biotechnology investor Dmitry Kaminski and Prof Derek Smith, who spoke on gain-of-function influenza research at a CSER lecture earlier this year.

“Bioscience technologies have the power to build or destroy a world of abundance. Leveraging entrepreneurial opportunities whilst avoiding catastrophic risk is a balancing act with potentially fatal consequences.”
Attendance is free and open to all; please register below.
https://www.eventbrite.com/e/the-future-of-biotech-enterprise-exponential-opportunities-and-existential-risks-tickets-19638230476

Venue: The Queen’s Lecture Theatre, Emmanuel College, Cambridge
Date: Wednesday 2nd December
Time: 15:30-17:00

CSER Public Lecture: Jane Heal on Pushing the Limits, November 20th

The next CSER public lecture will take place on Friday November 20th at 5pm, and will be given by Professor Jane Heal (Philosophy, Cambridge).

What do the theory of evolution, intellectual history and philosophy tell us about what we human beings are like? And what resources – intellectual, emotional, moral – can we muster for dealing with the existential risks of our current situation? The talk will offer a speculative overview of these topics, setting the scene for the challenging issues CSER faces.

http://cser.org/event/pushing-the-limits-public-lecture-with-professor-jane-heal/

Tickets available here (free):
https://www.eventbrite.co.uk/e/cser-seminar-series-public-lecture-with-professor-jane-heal-tickets-19207983596

Four new positions at the Centre for the Study of Existential Risk

The Centre for the Study of Existential Risk is delighted to announce four new postdoctoral positions for the subprojects below, to begin in January 2016 or as soon as possible afterwards. The research associates will join a growing team of researchers developing a general methodology for the management of extreme technological risk.

Evaluation of extreme technological risk will examine issues such as: the use and limitations of approaches such as cost-benefit analysis when evaluating extreme technological risk; the importance of mitigating extreme technological risk compared to other global priorities; issues in population ethics as they relate to future generations; challenges associated with evaluating small probabilities of large payoffs; and challenges associated with moral and evaluative uncertainty as they relate to the long-term future of humanity. Relevant disciplines include philosophy and economics, although suitable candidates outside these fields are welcomed.

Extreme risk and the culture of science will explore the hypothesis that the culture of science is in some ways ill-adapted to the successful long-term management of extreme technological risk, and investigate the option of ‘tweaking’ scientific practice so as to improve its suitability for this special task. It will examine topics including inductive risk, the use and limitations of the precautionary principle, and the case for scientific pluralism and ‘breakout thinking’ where extreme technological risk is concerned. Relevant disciplines include philosophy of science and science and technology studies, although suitable candidates outside these fields are welcomed.

Responsible innovation and extreme technological risk asks what can be done to encourage risk-awareness and societal responsibility, without discouraging innovation, within the communities developing future technologies with transformative potential. What can be learned from historical examples of technology governance and culture-development? What are the roles of different forms of regulation in the development of transformative technologies with risk potential? Relevant disciplines include science and technology studies, geography, sociology, governance, philosophy of science, plus relevant technological fields (e.g., AI, biotechnology, geoengineering), although suitable candidates outside these fields are welcomed.

We are also seeking to appoint an academic project manager, who will play a central role in developing CSER into a world-class research centre. We seek an ambitious candidate with initiative and a broad intellectual range for a postdoctoral role combining academic and administrative responsibilities. The Academic Project Manager will co-ordinate and develop CSER’s projects and the Centre’s overall profile, and build and maintain collaborations with academic centres, industry leaders and policy makers in the UK and worldwide. This is a unique opportunity to play a formative research development role in the establishment of a world-class centre.

Candidates will normally have a PhD in a relevant field or an equivalent level of experience and accomplishment (for example, in a policy, industry, or think tank setting). Application deadline: midday (12:00) on November 12th, 2015.

The Vulnerability of Man

CSER’s Jaan Tallinn, Professor Sir John Beddington (Senior Advisor at the Oxford Martin School and former UK Government Chief Scientific Adviser) and Sir Crispin Tickell (former diplomat and advisor to successive UK Prime Ministers, regarded as one of the world’s foremost authorities on climate change and environmental issues) speak to Vikas Shah at Thought Economics on existential risk and the vulnerability of our species.

http://thoughteconomics.com/the-vulnerability-of-man/

Researchers Urge UN to Ban Autonomous Weapons

Over 1,000 researchers working in artificial intelligence and robotics have signed an open letter to the United Nations urging that the development and use of autonomous weapons be banned.

Presented at the International Joint Conference on Artificial Intelligence in Buenos Aires, the letter includes signatures from CSER’s co-founders Jaan Tallinn and Huw Price as well as CSER advisors Stephen Hawking, Elon Musk and Stuart Russell.

The letter acknowledges that “AI has great potential to benefit humanity in many ways”, but warns that “Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control”.

The complete open letter and list of signatories can be read here.

Climate Change: A Risk Assessment

CSER’s Professor Sir Partha Dasgupta and Lord Martin Rees have both made expert contributions to a recently released risk assessment report on climate change.

Commissioned by the UK Foreign and Commonwealth Office, and edited and produced by the Centre for Science and Policy (CSaP) at the University of Cambridge, the report has been compiled as an independent contribution to the climate change debate. It “argues that the risks of climate change should be assessed in the same way as risks to national security, financial stability, or public health. That means we should concentrate especially on understanding what is the worst that could happen, and how likely that might be”.

Read the full report here.

£1 million grant for research into long-term impacts of artificial intelligence

The Centre for the Study of Existential Risk and the Future of Humanity Institute, part of the Oxford Martin School at the University of Oxford, will together receive a £1m grant for policy and technical research into the development of machine intelligence.

The submitted technical abstract reads: “The center will focus explicitly on the long-term impacts of AI, the strategic implications of powerful AI systems as they come to exceed human capabilities in most domains of interest, and the policy responses that could best be used to mitigate the potential risks of this technology.”

The grant comes from the Future of Life Institute in Boston, USA, and is funded by the Open Philanthropy Project and Elon Musk.

http://www.oxfordmartin.ox.ac.uk/news/201507_FLI