> I think it’s fairly clear from this that he doesn’t have Solomonoff induction internalized; he doesn’t know how many of his objections to Bayesian metaphysics it answers.
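For context, since it carries weight here: Solomonoff induction is Bayesian inference with the prior pinned down rather than left subjective. Every computable hypothesis gets weight $2^{-\ell(p)}$, where $\ell(p)$ is the length in bits of a program $p$ that produces it on a fixed universal machine $U$:

$$M(x) \;=\; \sum_{p\,:\,U(p)=x*} 2^{-\ell(p)}, \qquad M(x_{t+1} \mid x_{1:t}) \;=\; \frac{M(x_{1:t}\,x_{t+1})}{M(x_{1:t})}$$

(here $U(p)=x*$ means $p$’s output begins with $x$). This is the standard answer to “Bayesian priors are arbitrary”; how many of DD’s *other* objections it answers is exactly what’s at issue.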
I suspect, for DD, it’s not about *how many* but *all*. If I come up with 10 reasons Bayesianism is wrong (so 10 criticisms), and 9 of those get answered adequately, the 1 that’s still left is as bad as the 10; *any* unanswered criticism is a reason not to believe an idea. So convincing DD (or any decent Popperian) to accept an idea can’t rely on incomplete rebuttals: the idea needs to be *uncriticised* (answered criticisms don’t count here, though those answers could themselves be criticised; that entire chain can be long, and all of it needs to be resolved). There are also ideas answering questions like “what happens when you get to an ‘I don’t know’ point?” or “what happens with two competing ideas, both of which are uncriticised?”
Clarifying point: some ideas (like MWI, string theory, etc.) are very difficult to criticise by showing a contradiction with evidence, but the fact that two competing ideas exist means they’re either compatible in a way we don’t realise or they offer some criticisms of each other, even if we can’t easily judge the quality of those criticisms at the time.
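The evaluation rule above is nearly an algorithm, so here’s a minimal sketch of it in Python. The `Idea`/`stands` names and the tree representation are my own illustration, not DD’s formalism: a criticism is just another idea, an answer to a criticism is a criticism of that criticism, and an idea stands only if no criticism of it stands.

```python
# Illustrative sketch only: CR isn't usually formalised like this.
from dataclasses import dataclass, field

@dataclass
class Idea:
    """An idea, a criticism, or an answer to a criticism -- in CR these
    are all just ideas, and any of them can itself be criticised."""
    name: str
    criticisms: list["Idea"] = field(default_factory=list)

def stands(idea: Idea) -> bool:
    """An idea stands iff every criticism of it is itself refuted
    (i.e. fails to stand). A single standing criticism sinks the idea,
    no matter how many of the others were answered."""
    return all(not stands(c) for c in idea.criticisms)

# Ten criticisms of an idea; nine get adequate (unchallenged) answers.
bayes = Idea("Bayesianism")
bayes.criticisms = [Idea(f"criticism {i}") for i in range(10)]
for crit in bayes.criticisms[:9]:
    crit.criticisms.append(Idea(f"answer to {crit.name}"))

print(stands(bayes))  # False -- the one unanswered criticism is enough
```

Note the asymmetry: answering nine of ten criticisms changes nothing; the verdict is binary, not a weighted tally.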
Note: I’m not a Bayesian; DD’s book *The Beginning of Infinity* convinced me that Popper’s foundation for epistemology (including the ideas that build on it or improve it) was better in a decisive way.
> In what way are the epistemologies actually in conflict?
Well, they disagree on how to judge ideas, and on when it’s okay to treat an idea as ‘true’.
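To make that concrete (my gloss, not either camp’s wording): BE judges an idea by a degree of belief that moves continuously as evidence arrives,

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)},$$

while the CR rule sketched earlier is binary: an idea either has a standing criticism or it doesn’t, and only in the second case is it okay to act on it as if it were true.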
There are practical consequences to this disagreement; some of the best CR thinkers claim MIRI are making mistakes that are detrimental to the future of humanity+AGI, for **epistemic** reasons no less.
> My impression is that it is more just a case of two groups of people who maybe don’t understand each other well enough, rather than a case of substantive disagreement between the useful theories that they have, regardless of what DD thinks it is.
I have a sense of something like this, too, both in the way LW and CR “read” each other and in the more practical sense that the two often agree on outcomes when applied.
I do still think there is a substantive disagreement, though. I also think DD is one of the best thinkers wrt CR, and I broadly endorse ~everything in BoI (with a few caveats: a typo and improvements to how-to-vary, at least; I’ll mention more if they come up. The yes/no stuff I mentioned in another post is an example of one of these caveats). I mention endorsing BoI b/c if you wanted to quote something from BoI, it’s highly likely I wouldn’t have an issue with it (so it’s a good source of material for critical discussion).
> Bayes does not disagree with true things, nor does it disagree with useful rules of thumb.
CR agrees here, though there is a good explanation of “rules of thumb” in BoI that covers how, when, and why rules of thumb can be dangerous and/or wrong.
> Whatever it is you have, I think it will be conceivable from Bayesian epistemological primitives, and conceiving it in those primitives will give you a clearer idea of what it really is.
This might be a good way to try to find disagreements between BE (Bayesian epistemology) and CR in more detail. It also tests my understanding of CR (and maybe a bit of BE too).
I’ve given some details on the sorts of principles in CR in my replies^1. If you’d like to try this, do you have any ideas on where to go next? I’m happy to provide more detail with some prompting about the things you take issue with, or that you think need more explanation or have criticisms that need answering.
[1]: or, at least, my sub-school of thought; some of the things I’ve said are actually controversial within CR, but I’m not sure those differences will be significant here.