There are obscurantists who wear their obscurantism as attire, proudly claiming that it is impossible to know whether God exists. It can be said, perhaps, that such an obscurantist has a preference for not knowing the answer to the question, for never storing the belief "God does (not) exist" in his brain. But still, all the obscurantist's decisions are the same as if he believed that there is no God; the obscurantist belief has no influence on his other preferences. In such a case, you may well argue that the extrapolated volition of the obscurantist is to act as if he knew the answer, and therefore the obscurantist beliefs are shattered. But this is also true of his non-extrapolated volition. If the non-extrapolated volition already ignores the obscurantist belief and can coexist with it, why is this possibility excluded for the extrapolated volition? Because of the "coherent" part? Does coherence of volition require that one is not mistaken about one's actual desires? (This is an honest question; I think that "volition" refers to the set of desires, which is to be made coherent by extrapolation in the case of CEV, but that it doesn't refer to beliefs about those desires. But I haven't been that interested in CEV and may be mistaken about this.)
The more interesting case is an obscurantist who holds obscurantism as a worldview with real consequences. To keep things plausible (I am not sure whether obscurantists of this kind exist in non-negligible numbers), imagine a woman who holds that the efficacy of homeopathic remedies can never be established with any reasonable certainty. Now she gets cancer and has two treatment options: a conventional one, with a 10% chance of success, and a homeopathic one, with a 0.1% chance (equal to that of a spontaneous remission). But, in accordance with her obscurantism, she believes that assigning anything other than 50% to homeopathy working would mean that we know the answer here, and since we can't know, homeopathy has, to her, a 50% chance of success.
Acting on these beliefs, she decides for the homeopathic treatment. One of her desires is to survive, which upon extrapolation leads to choosing the conventional treatment, thus creating a conflict with her actual decision. But isn't it plausible that another of her desires, namely to always decide as if the chance of homeopathy working were 50%, is strong enough to survive the extrapolation and take precedence over the desire to survive? People have died for their beliefs many times.
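To make the conflict concrete, here is a minimal sketch of the decision calculus, assuming (my assumption, not anything taken from CEV itself) that she simply picks the treatment with the higher believed chance of survival, and that extrapolation only swaps in the correct probabilities:

```python
# Hypothetical numbers from the example above: the same choice rule, run first on her
# obscurantist beliefs and then on the actual (extrapolated) probabilities.

actual = {"conventional": 0.10, "homeopathic": 0.001}   # real success chances
believed = {"conventional": 0.10, "homeopathic": 0.50}  # her obscurantist 50% assignment

def choose(chances):
    """Pick the treatment with the higher believed chance of survival."""
    return max(chances, key=chances.get)

print(choose(believed))  # 'homeopathic'  -- the decision she actually makes
print(choose(actual))    # 'conventional' -- what her desire to survive yields upon extrapolation
```

The question above is whether extrapolation really works like this, i.e. whether it only corrects the probabilities, or whether her desire to decide as if the chance were 50% survives as well.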
Edit: I find it mildly annoying when, in answering a comment or post, people point out obvious things whose relevance to the comment or post is dubious, without further explanation. If you think that the non-equivalence of the mentioned beliefs somehow implies the impossibility of extrapolating obscurantist values, please elaborate. If you just thought that I might have committed a sloppy inference and it would be cool to correct me on it, please don't. It (1) derails the discussion into uninteresting nitpickery and (2) motivates commenters to clutter their comments with disclaimers in order to avoid being suspected of sloppy reasoning.
Holding that the efficacy of homeopathics can never be established with any reasonable certainty != assigning a success chance of 50%.
Tell that to the hypothetical obscurantist.