Robin Hanson has written a characteristically interesting article about the possibility of human extinction. And with characteristic understatement, Hanson notes that “[a] disaster large enough to kill off humanity … should be of special concern.” Indeed. Hanson’s point, of course, is that wiping out all of humanity is much worse than wiping out almost all of humanity.
Hanson appears to worry about extinction in part because, he observes, disasters sometimes follow a power law distribution in the destruction that they cause. This suggests that in expected value terms, we should perhaps worry as much as or more about disasters that wipe out humanity than about disasters on the scale observed in past human history.
Extrapolating from some assumptions, Hanson suggests there may be a one in three million chance per year of an event that would kill everyone (perhaps not instantaneously, but in a gradual collapse, as the failure of some social systems leads to the failure of others). My inclination is to agree with Hanson that the danger of extinction is sufficiently severe that it’s worth worrying about. But I wouldn’t rely too much on extending a power law distribution beyond previously observed events. That may actually understate the danger of extinction, which seems to me to be very low but a lot higher than one in three million. In part, this is because modern technology creates many scenarios for catastrophe (nuclear war, superviruses, nanotech gray goo, the possibility that the Knicks could keep Isiah Thomas after this year) that could not have occurred hundreds of years ago.
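To see how this kind of extrapolation works mechanically, here is a toy calculation. It is a sketch only: the threshold, tail exponent, and annual rate below are hypothetical placeholders chosen for illustration, not figures from Hanson’s article, and real disaster data need not follow a clean Pareto tail all the way out to extinction-scale events.

```python
# Toy power-law (Pareto) tail extrapolation for disaster death tolls.
# If P(deaths >= x) = (x / x_min)**(-alpha) for x >= x_min, we can ask
# how often an event large enough to kill everyone would occur.

def tail_probability(x, x_min, alpha):
    """Survival function of a Pareto (power-law) distribution:
    the probability that a qualifying disaster kills at least x people."""
    return (x / x_min) ** (-alpha)

# Hypothetical parameters: suppose disasters killing at least 10,000
# people arrive about once per year, with tail exponent alpha = 1.2.
x_min = 1e4          # threshold death toll defining a "disaster"
alpha = 1.2          # hypothetical tail exponent
rate_per_year = 1.0  # hypothetical annual rate of disasters above x_min

population = 7e9     # rough world population

# Annual chance of a disaster large enough to kill everyone.
p_extinction = rate_per_year * tail_probability(population, x_min, alpha)
print(f"annual extinction-scale probability: {p_extinction:.2e}")
```

Note how sensitive the answer is to alpha: nudging the exponent slightly moves the extinction probability by orders of magnitude, which is one reason not to lean too hard on extending the fitted tail far beyond observed events.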
One argument that Hanson makes is that it might be useful to establish refuges to ensure that if a disaster occurred, at least some small number of people (perhaps 100) would survive. These people could return to a hunter-gatherer lifestyle, eventually develop the capacity to communicate innovations, and then, within maybe a mere twenty thousand years, a blip of cosmic time, return to where we are now. While there is value in such an approach, government policy might not make a big difference. If some calamity is strong enough to prevent the survivalists from surviving, it is hard to believe that government-produced sanctuaries would do much better. Only a narrow class of extinction events would kill even those who, from the perspective of most observers, have absurdly exaggerated estimates of the probability of catastrophe, while sparing a government project.