It’s like the people who congregate on top of a hill waiting for the angels or the flying saucers to take them up to heaven. They just go “well our date was wrong, but that doesn’t mean it’s not going to happen, of course it is, Real Soon Now.”
That the Singularity concept pattern-matches doomsday cults is nothing new to anyone here. You looked further into it and concluded it was false; wedrifid and others looked into it and concluded it was possible. The discussion has since moved on to the evidence separating those two points of view. Repeating that it looks like a doomsday cult is a step backwards, to where this discussion started.
rwallace’s argument doesn’t center on the standard reasons the Singularity looks like a doomsday cult. He’s pointing to an apparent history of repeated predictions with no updating when those predictions failed. That’s different from the standard claim about why Singularitarianism pattern-matches doomsday cults, and if he is correct about that history, it should be fairly disturbing to a Bayesian.
Fair enough. I guess his rant pattern-matched the usual anti-doomsday-cult material I see around the Singularity. Keep in mind that, as a Bayesian, you can respond to a failed prediction by downgrading your estimate of the people making the predictions rather than the likelihood of the event. That is certainly what I have done: I put little weight on predictions, even from people I trust to reason well, because a history of failed predictions has taught me not that the predicted events won’t happen, but that predictions are full of crap. The flip side is that this greatly reduces the evidential value of predictions that turn out, in hindsight, to be correct. Treating a single correct prediction as sufficient evidence seems to be a common failure mode of many belief mechanisms; I would instead require that the process which produced the prediction predicts correctly consistently.
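For concreteness, here's a minimal sketch of that kind of joint update. All the priors and likelihood numbers below are made up for illustration; the point is only the structure: once "the prediction-generating process is unreliable" is a live hypothesis, repeated failed predictions can land mostly on the predictors rather than on the event.

```python
# Toy joint Bayesian update over two binary hypotheses (all numbers are
# assumptions for illustration, not anything from this thread):
#   E: "the predicted event will eventually happen"
#   R: "the prediction-generating process is reliable"
# Observation, repeated n times: a dated prediction of the event failed.
from itertools import product

# Assumed independent priors: P(E) = P(R) = 0.5
prior = {(e, r): 0.25 for e, r in product([True, False], repeat=2)}

def p_fail(e: bool, r: bool) -> float:
    """Assumed likelihood that one dated prediction fails, given (E, R)."""
    if not e:
        return 1.0            # event never happens: every dated prediction fails
    return 0.2 if r else 0.9  # reliable processes rarely blow a date; unreliable ones usually do

n_failures = 5

# Posterior over (E, R) after n conditionally independent failed predictions
post = {h: p * p_fail(*h) ** n_failures for h, p in prior.items()}
z = sum(post.values())
post = {h: p / z for h, p in post.items()}

p_event = post[(True, True)] + post[(True, False)]
p_reliable = post[(True, True)] + post[(False, True)]
print(f"joint model: P(event) = {p_event:.3f}, P(reliable) = {p_reliable:.3f}")

# Naive model that never questions the predictors (every forecaster assumed
# reliable), so the same failures count as strong evidence against the event.
naive_true = 0.5 * 0.2 ** n_failures
naive_false = 0.5 * 1.0 ** n_failures
print(f"naive model: P(event) = {naive_true / (naive_true + naive_false):.3f}")
```

With these made-up numbers, five failed predictions leave P(event) at about 0.23 in the joint model, versus essentially zero in the naive model that can only blame the event; much of the update is absorbed by P(reliable), which falls from 0.5 to about 0.39. The same structure shows the flip side: once the unreliable-predictor hypothesis dominates, a single lucky correct prediction buys the event hypothesis very little.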