Someone who reacts to a gap in the sky with "it's most likely a hallucination" may, with incredibly low probability, encounter the described hypothetical where it is not a hallucination, and lose out. Yet this person would perform far better when their drink got spiked with LSD, or if they naturally developed an equivalent fault.
What Eliezer is actually saying about this kind of hallucination:
I mean, in practice, I would tend to try and take certain actions intended to do something about the rather high posterior probability that I was hallucinating and be particularly wary of actions that sound like the sort of thing psychotic patients hallucinate, but this is an artifact of the odd construction of the scenario and wouldn’t apply to the more realistic and likely-to-be-actually-encountered case of the physics theory which implied we could use dark energy for computation or whatever.
The kind of 'hallucination' discussed in the posts is a different matter: the problem of being forced to believe you are a Boltzmann brain, or a descendant human seamlessly hallucinating being an 'ancestor', before you can believe it is likely that there will be many humans in the future. This is an entirely different kind of issue.