at least assuming that people who fail to be Christian solely because they’ve never heard of (or seriously thought about) Christianity in the first place won’t go to hell.
I’ve thought about the idea enough to realize that (assuming one takes it seriously at all) the above is not guaranteed.
Well, people who failed to be Christian because they lived before Jesus ended up in limbo, according to Dante. I’m not sure if that’s based on any actual theology.
IIRC, the current stance of the Church is the reverse of that: atheism is a sin if you’ve heard of the idea of God but refuse to think seriously about it, but not if, despite thinking it through, you still can’t believe it.
Can you source that?
I think I read that in Youcat where it talks about the first commandment, but neither the Google Books nor the Amazon previews contain that part of the book.
I was actually referring to the basilisk.
You mean that gung onq guvat zvtug unccra rira gb gubfr jub unira’g urneq be gubhtug nobhg gung fpranevb?
Yes.
Consider using the term “Roko’s Basilisk” for clarity.
That doesn’t sound plausible to me, but if you’re right, the right thing to do would be letting as many people as possible know about the issue, so that it’s more likely to be averted.
The way it works is: if people are keeping the basilisk a secret for the sake of protecting others (even if it increases their own punishment), that means those people value protecting others over their own safety. Therefore, a more effective way to punish them is to torture those they’re trying to protect.
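(To make the leverage argument concrete, here’s a minimal toy model in Python; the outcome names and payoff numbers are my own illustrative assumptions, not anything established about the scenario.)

```python
# Toy model of the leverage argument above. All outcome names and
# numbers are made-up illustrative assumptions.

def victim_utility(outcome):
    # Hypothetical values for someone who cares more about the
    # people they're protecting than about themselves.
    costs = {
        "torture_victim": -10,
        "torture_protected_others": -100,
    }
    return costs[outcome]

def best_threat(threats, utility):
    # The blackmailer maximizes leverage: pick the threat the
    # victim's own utility function rates as worst.
    return min(threats, key=utility)

threats = ["torture_victim", "torture_protected_others"]
print(best_threat(threats, victim_utility))
# -> "torture_protected_others": valuing others' safety over your
#    own tells the blackmailer which threat carries more leverage.
```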
Are you sure you don’t want to at the very least rot-13 that? Some people here have explicitly said they’d rather not find out what the basilisk is.
In Newcomb’s problem, a good agent will 1-box in the emulator and 2-box in reality if it can tell the simulation apart from reality. Even the tiniest flaw in the emulation results in a lack of incentive to follow through on the basilisk threat. You need a very dumb decision theory for the agent to just torture people for no gain.
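(A rough sketch in Python of what I mean, under my own toy formalization rather than any actual decision theory: if the agent can condition its choice on whether it’s inside the predictor’s emulation, it collects both payouts and the prediction carries no leverage.)

```python
# Toy Newcomb setup: the agent distinguishes the (flawed) emulation
# from reality, so it 1-boxes in the sim and 2-boxes for real.
# Payoffs are the standard illustrative ones ($1,000,000 / $1,000).

def agent(in_simulation: bool) -> str:
    # Any detectable flaw in the emulation lets the agent condition
    # its decision on where it is running.
    return "one-box" if in_simulation else "two-box"

def predictor_fills_opaque_box() -> bool:
    # The predictor emulates the agent and fills the opaque box
    # iff the emulated agent one-boxes.
    return agent(in_simulation=True) == "one-box"

opaque = 1_000_000 if predictor_fills_opaque_box() else 0
transparent = 1_000 if agent(in_simulation=False) == "two-box" else 0
print(opaque + transparent)  # 1001000: the agent gets both payouts
```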
Yes, and in that case the basilisk isn’t a problem at all. My point is that under any decision-theoretic assumptions Eliezer’s strategy of secrecy doesn’t help.
Well, yeah. The whole thing is just stupid, however you look at it.
I hope the downvotes of the parent are for taboo violation and not for content. When it comes to Roko’s Basilisk specifically (considering potential spooky acausal variants separately) Army’s solution is correct. With the caveat firmly in place I don’t believe even Eliezer would disagree with that. If he did then I would have to seriously reconsider my support for SIAI—it would indicate that he is someone who is likely to actually implement (or support the implementation of) the Basilisk’s glare.
I indeed suspect that someone is just downvoting all posts mentioning the basilisk regardless of content. (As for “[T]hat doesn’t sound plausible to me”, this is slightly less true now than when I wrote that post—see http://lesswrong.com/lw/2ft/open_thread_july_2010_part_2/64f2.)
That is certainly not consistent with his behavior.
Do you mean “not guaranteed that, given that hell exists, people who have never heard of it won’t go there”, or “not guaranteed that, given that hell exists and that people who have never heard of it won’t go there, it is equivalent to [the thing that should not be mentioned]”?