Sarah Knapton, the Telegraph Science Editor, covered our new report:
'Researchers from Cambridge’s Centre for the Study of Existential Risk (CSER) said the government was failing to prepare for ‘human-driven catastrophic risks’ that could lead to mass harm and societal collapse. In recent years advances in science such as genetic engineering and artificial intelligence (AI) have opened the door to a host of new threats. In a new report, experts called on policy-makers to ‘protect their citizens’ and start preparing for events such as a devastating bioengineered pandemic or programmers losing control of AI systems.
Former Defence Secretary Des Browne said: “Our leaders can, and must, act now to better understand the global catastrophic risks that are present and developing. National governments struggle with understanding and developing policy for the elimination or mitigation of extreme risks, including global catastrophic risks. Effective policies may compel fundamental structural reform of political systems, but we do not need, nor do we have the time, to wait for such change.”
The report also calls for an independent review of extreme risks to Britain and the world, and a review of national strategies currently in place to deal with emerging threats.'