“You know what they say the modern version of Pascal’s Wager is? Sucking up to as many Transhumanists as possible, just in case one of them turns into God.”—Julie from Crystal Nights by Greg Egan
Okay, that one’s funny! :)
Also, Crystal Nights is a good story about [CENSORED] from [CENSORED].
In fact, it’s almost exactly the mirror image of Eliezer’s Gung Nyvra Zrffntr, which is pretty awesome.
I hope that’s not a spoiler, because I haven’t read that story. If it is, please delete it or ROT13 it right now and don’t do it again.
It isn’t, at least, not in the sense of being a story whose punchline is “...naq vg jnf nyy n fvzhyngvba”. You would already be foreseeing what Roko has mentioned by the end of the second screenful (and crying out, “Ab! Ab! Lbh znq sbbyf, unir V gnhtug lbh abguvat?”).
Reading through it now. There are two relevant words in Roko’s description, only one of which is obvious from the outset.
Still, I’m not sure I fully agree with LW’s spoiler policy. I wouldn’t be reading this piece at all if not for Roko’s description of it. When the spoiler is that the text is relevant to an issue that’s actually discussed on Less Wrong (rather than mere story details, e.g. C3PO is R2D2’s father), then telling people about the spoiler is necessary...
So rot13?
I suppose. The comment could be:
“Also, Crystal Nights is a good story about a topic of some interest to the futurist/transhumanist element on LW, namely rfpncr sebz n fvzhyngvba.”
This story sports an interesting variation on the mind projection fallacy anti-pattern. Instead of confusing intrinsic properties with properties whose observation depends both on one’s mind and on one’s object of study, this variation confuses intrinsically correct conclusions with conclusions whose validity depends both on the configuration of the world and on the correct interpretation of the evidence. In particular, one of the characters would like the inhabitants of the simulation to reconstruct our modern, “correct” scientific theories, even though those theories are not, in fact, a correct description of the simulated world.
Here is a relevant (and spoiler-free) passage.

[The simulation’s] stars were just a planetarium-like backdrop, present only to help [the inhabitants of the simulation] get their notions of heliocentricity and inertia right
The mistake, of course, is that if the simulation’s sun is merely projected on a rotating dome, then heliocentricity isn’t right at all.
Edit: it turns out that Eliezer already generalized this anti-pattern from minds to worlds a while ago.