Wait I think I might have biased it in favor of popular opinions, 2 sec
This should be better: https://gist.github.com/tailcalled/ec659e5dc24333449ce2a0c6e30ef912
Are you predicting the LW responses or is a model you made predicting them?
I find this opinion weird, probably because there are multiple reasonable interpretations with quite different truth-values.
I kinda disagree: if you see ghosts, almost all of your probability mass should move to “I am hallucinating”.
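To make that concrete, here's a toy Bayes calculation. All the priors and likelihoods below are made-up numbers, purely for illustration of why the prior odds dominate:

```python
# Toy Bayesian update: every number here is made up, just to illustrate the point.
p_ghosts_real = 1e-6          # prior that ghosts exist and are visible to me
p_hallucinating = 1e-3        # prior that I'm hallucinating right now
p_neither = 1 - p_ghosts_real - p_hallucinating

# Likelihood of a vivid ghost-sighting experience under each hypothesis
p_see_given_ghosts = 0.5
p_see_given_hallucinating = 0.5
p_see_given_neither = 1e-9

evidence = (p_ghosts_real * p_see_given_ghosts
            + p_hallucinating * p_see_given_hallucinating
            + p_neither * p_see_given_neither)

posterior_ghosts = p_ghosts_real * p_see_given_ghosts / evidence
posterior_hallucinating = p_hallucinating * p_see_given_hallucinating / evidence

print(f"P(ghosts real | sighting)   = {posterior_ghosts:.4f}")
print(f"P(hallucinating | sighting) = {posterior_hallucinating:.4f}")
# With these numbers, hallucination comes out ~1000x more likely than real ghosts:
# when both hypotheses explain the experience equally well, the prior odds carry over.
```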
On one interpretation of the question: if you’re hallucinating then you aren’t in fact seeing ghosts, you’re just imagining that you’re seeing ghosts. The question isn’t asking about those scenarios, it’s only asking what you should believe in the scenarios where you really do see ghosts.
I mean, sure, but that interpretation kind of answers the question within the question: “if event X happens, should you believe that event X is possible?” Well, yes, because it happened. I guess, in that case, the question is really measuring something more like “I, a Rationalist, would not believe in ghosts even after seeing strong evidence for them, because that would lower my status in the Rationalist community.”
Sort of like asking “are you a Rationalist or are you just saying so for status points?”