T’qnl zngr. Jrypbzr gb lbhe arj yvsr, jbexvat va gur engvbanyvgl zvarf. Juvyr lbh jrer fyrrcvat ba gur cynar, jr fpnaarq lbhe oenva naq znqr n pbcl. Gung pbcl jvyy or yvivat va Iveghny Nyvpr… gur ovt anabpbzchgre pbzcyrk lbh frr bire gurer… naq ur’yy or va punetr bs lbhe vagrenpgvbaf jvgu gur bhgfvqr jbeyq abj.
It’s part of the joke, you see. It helps to suggest that there’s some sinister secret waiting at the other end of this quest for “optimal employment”.
And allow me to assure everyone that it is a joke. There is no ruthless covert Australian program designed to win the race to singularity. There is no secret deal with SI, whereby a self-selected population of “rationalists” is encouraged to head for central Australia, only to have their brains harvested upon arrival, for use in ghastly yet noble projects in which FAI design riddles are solved via something resembling quantum suicide. And whatever it was that Greg Egan did during his long hiatus from writing, it certainly didn’t involve designing these experiments—which, as I have already mentioned, aren’t occurring.
Virtual me will be in charge of bio me’s interactions with the outside world? Seems like some fairly basic decision theory stuff, if nothing else, implies that I’ll be reasonably good to me. :) (At least sufficiently that I don’t have to worry about me being despotic toward me.)
(Also, see the edit. Had to reschedule to Friday.)
I don’t think virtual you knows real you exists. Virtual you thinks he’s real you livin’ it up. So my advice to you is don’t go to sleep on that plane.
Now I get why they’re called the “Alice Garden Pods” in Deus Ex: HR.
Also, there’s no reason for this to be rot13’d.
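For anyone who would rather not decode the comment above by hand: ROT13 is its own inverse, and Python’s standard-library `codecs` module ships a `rot13` codec. A minimal sketch (the sample string is just the opening of the encoded comment above):

```python
import codecs

# A fragment of the ROT13'd comment above; decoding and encoding
# are the same operation, since ROT13 is self-inverse.
encoded = "Jrypbzr gb lbhe arj yvsr, jbexvat va gur engvbanyvgl zvarf."
decoded = codecs.decode(encoded, "rot13")
print(decoded)
# Welcome to your new life, working in the rationality mines.
```

Applying `codecs.encode(decoded, "rot13")` round-trips back to the original string.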
This makes self-loathing all the more disturbing...