Tom, I don’t take the risk seriously. Richard Hollerith explained why well: I don’t think people who just want to do something randomly destructive are good at the kind of long-range planning and collaboration needed to be seriously threatening, and if they were, they wouldn’t need us to give them extremely general ideas.
I was reminded of something Michael Vassar said on SL4 (emphasis mine):
For all of our arrogance, most Transhumanists grossly overestimate the abilities of ordinary humans. This is substantially a consequence of how folk psychology works, and fails to work, for outliers, but also a consequence of typically limited and isolated life experience. Unfortunately, it has serious consequences when predicting the future. Our estimates of the likely behavior of large-scale groups, the effort that will be devoted to a particular research objective, or the time until some task is accomplished are all grossly distorted. For many transhumanists this means that boogie men such as “terrorists” are imagined as something that never was, disutility maximizers, and the resultant threats of bioterror and nanoterror are overestimated by many orders of magnitude. For almost all transhumanists this means an underestimation of inertia, leading to Chris Phoenix’s fears of pre-emptive arms races and Nick Bostrom’s utopian dreams of world government and benign regulation of dangerous tech.