Note: I’m not a Bayesian; DD’s book *The Beginning of Infinity* convinced me that Popper’s foundation for epistemology (including the ideas that build on and improve it) is better in a decisive way.
> In what way are the epistemologies actually in conflict?
Well, they disagree on how to judge ideas, and on when it’s okay to treat an idea as ‘true’.
There are practical consequences to this disagreement; some of the best CR thinkers claim MIRI is making mistakes that are detrimental to the future of humanity+AGI, for **epistemic** reasons no less.
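To make that contrast concrete (this is my gloss, not something stated in the thread): BE’s primitive for judging an idea $H$ is the posterior you get by conditioning on evidence $E$,

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)},$$

so an idea is okay to treat as ‘true’ roughly to the degree that this posterior is high. CR instead asks whether $H$ is a good explanation that has survived criticism, and whether any non-refuted rival explanations remain.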
> My impression is that it is more just a case of two groups of people who maybe don’t understand each other well enough, rather than a case of substantive disagreement between the useful theories that they have, regardless of what DD thinks it is.
I have a sense of something like this too, both in the way LW and CR “read” each other, and in the more practical sense that they often agree on the outcomes of many applications.
I do still think there is a substantive disagreement, though. I also think DD is one of the best thinkers wrt CR, and I broadly endorse ~everything in BoI (there are a few caveats: a typo and improvements to how-to-vary, at least; I’ll mention more if they come up. The yes/no stuff I mentioned in another post is an example of one of these caveats). I mention endorsing BoI because, if you wanted to quote something from BoI, it’s highly likely I wouldn’t have an issue with it, so it’s a good source of material for critical discussion.
> Bayes does not disagree with true things, nor does it disagree with useful rules of thumb.
CR agrees here, though there is a good explanation of “rules of thumb” in BoI that covers how, when, and why rules of thumb can be dangerous and/or wrong.
> Whatever it is you have, I think it will be conceivable from bayesian epistemological primitives, and conceiving it in those primitives will give you a clearer idea of what it really is.
This might be a good way to try to find disagreements between BE (Bayesian Epistemology) and CR in more detail. It also tests my understanding of CR (and maybe a bit of BE too).
I’ve given some details on the sorts of principles in CR in my replies.^1 If you’d like to try this, do you have any ideas on where to go next? I’m happy to provide more detail with some prompting about the things you take issue with, or that you think need more explanation or answers to criticism.
[1]: or, at least, my sub-school of thought; some of the things I’ve said are actually controversial within CR, but I’m not sure those differences will be significant here.