“Hawking's rationale was that humankind would eventually fall victim to an extinction-level catastrophe - perhaps sooner rather than later. What worried him were so-called low-probability, high-impact events - a large asteroid striking our planet is the classic example. But Hawking perceived a host of other potential threats: artificial intelligence, climate change, GM viruses and nuclear war, to name a few.
In 2016, he told the BBC: "Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or 10,000 years."”
I'll take nuclear deterrence any day over that hell.