Thanks for the reply, Dave. Are you saying I should not look at falsifiability as a belief, but rather as a tool of some sort? That distinction sounds interesting, but it is not 100% clear to me. Perhaps someone should do a larger post about why the principle should not be applied to itself.
I have also thought of putting the problem this way: Eliezer states that the only ideas worth having are the ones we would be willing to give up. Is he willing to give up that idea? I don’t think so, and I would be really interested to know why he doesn’t believe this to be a contradiction.
What I’m saying is that what matters is what I can do with my beliefs. If the “principle of falsifiability” (PoF) does some valuable thing X, then in worlds where the PoF doesn’t do X, I should be willing to discard it. And if the PoF doesn’t do any valuable thing X, then I should be willing to discard it in this world.
It seems we have both empirical and non-empirical beliefs that can be rational, but “rational” has a different sense in each case. We call empirical beliefs “rational” when we have good evidence for them; we call non-empirical beliefs like the PoF “rational” when we find they have high utility, meaning there is a lot we can do with the principle (it excludes maps that can’t conform to any territory).
To answer my original question, it seems to follow that the PoF doesn’t apply to itself, since it is a principle meant for empirical beliefs only. Because the PoF is a different kind of belief from an empirical belief, it need not be falsifiable, only more useful than our current alternatives. What do you think about that?
I think it depends on what the PoF actually is.
If it can be restated as “I will on average be more effective at achieving my goals if I adopt only falsifiable beliefs,” for example, then it is equivalent to an empirical belief (and is, incidentally, falsifiable).
If it can be restated as “I should adopt only falsifiable beliefs, whether doing so gets me anything I want or not,” then there exists no empirical belief to which it is equivalent (and it is, incidentally, worth discarding).