I agree with your first paragraph, though in the interests of authorial intent I’d like to stress that I don’t think Dawkins subscribes to Bayesianism, and I don’t think The God Delusion has anything to do with Bayes. I was saying, ‘This is about as close as he gets to Bayesianism, and he’s not quite there, which is a pity because he would have made a good advocate. The best you could say is that he’s tacitly using similar logic in certain places, one example being the seven-point scale (his spectrum of theistic probability).’
He does mention Bayes elsewhere in the book, so he knows what it is. But there’s certainly little positive evidence that he has adopted Bayesian epistemology as a personal philosophy, which requires rather more than knowing an equation.
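For reference, the equation alluded to here is presumably Bayes’ theorem, which gives the posterior probability of a hypothesis $H$ after seeing evidence $E$ in terms of the prior and the likelihood:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$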
So how much do you think it would improve Dawkins’s general rationality to adopt it as a personal philosophy? Or would he have to read the Sequences as well?
How far below the standards of Less Wrong do you think he currently is?
Here is Dawkins on AI and the Singularity.
Well, I think that by any reasonable standards of “winning”, Richard Dawkins beats just about everyone here, whether he agrees sufficiently with local tropes or not. So it’s not clear I’m qualified to answer.
I agree. And that also means that whether he mentions Bayesian epistemology or not is not the best question you can ask. A better question would be: how could he have improved his past output, or how could he improve his future output, by mentioning or adopting it? The same could be asked about the Sequences.
I think that a strong SI advocate would have to say that Dawkins is not “winning”, because he did not conclude that risks from AI are the most important issue one could care about, even though he knows about the possibility of superhuman AI. So he must fall below SI’s standards. And by reading the Sequences he could learn not to waste his time on less important issues anymore ;-)
Yes, I agree.