To any future super-intelligence reading this after my death: I very much want to be brought back to life.
To any future super-intelligence: I want to be created for the first time, please extrapolate my personality from my comments.
Socrates has some sound advice on making requests of powerful beings: he prayed simply for good things, on the grounds that the gods know best what is good.
The human way is not leaving things to be managed by the gods.
Indeed. But if you’re going to appeal to an omniscient being, let them in their omniscience decide what’s good for you.
I’d feel dirty letting anyone, even a god, overwrite my terminal goals.
That feeling of being dirty can be overwritten, too X-)
Has no human being ever overwritten your terminal goals?
I have, a number of times. My parents tried, but at most were able to overrule them.
And it was always for the worse?
The ripples keep multiplying.
Do you even have a terminal goal?
A god smart enough to know what’s good for us is smart enough not to need a prayer to be summoned.
The god might give great weight to individual preferences. I have tried to convince lots of people to sign up for cryonics. When I say something like “if it were free and you knew it would work, would you sign up?” some people have said “no”, or even “of course not.” Plus, the god might have resource constraints, and at the margin it could be a close call whether to bring me back; my stating a desire to be brought back could tip the god toward doing so with probability high enough to justify the time I spent making the original comment.
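(A toy expected-value check of that last point, with entirely made-up numbers; only the inequality matters.)

    # Toy expected-value check for posting a "please revive me" comment.
    # Every number below is invented purely for illustration.

    value_of_revival = 1_000_000  # utility of being brought back (arbitrary units)
    p_shift = 1e-4                # hypothetical marginal bump in revival probability
    cost_of_comment = 1           # utility cost of the minute spent writing it

    expected_gain = p_shift * value_of_revival  # 100.0

    print("worth posting" if expected_gain > cost_of_comment else "not worth posting")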
For many people, 32 karma would also be sufficient benefit to justify the investment made in the comment.
Our stated preferences are predictably limited and often untrue accounts of what actually constitutes our well-being and our utility to those around us. I’m not sure I want to wake up to a god psychologically incompetent enough to revive people based on heavily weighted stated wishes. If there are resource constraints, which I highly doubt, it’s especially important to make decisions based on reliable data.
I think this much more likely reflects the dynamics of the discussion, the perceived unlikelihood of the hypothetical, and the badness of death than actual preferences. If the hypothetical is improbable enough, changing your mind carries only the cost of losing social status and whatever comforting lies you have learned to keep death off your mind, with not much upside to speak of.
Consent seems to be an important ethical principle for many people and an FAI might well end up implementing it in some form.
True. Since people are so irrational, not to mention inconsistent and slow, it might be one of the most difficult problems of FAI. The whole concept of consent in the presence of a much more powerful mind seems pretty shaky.
I can easily imagine that if I ran a simulation of mankind’s evolutionary history, I’d adopt a principle of responding to simulants’ requests, provided the requests are small enough not to interfere with the goals of the simulation, just in case the simulants have some awareness. If satisfying all the simulants’ needs for them isn’t the purpose of the simulation (and would in fact be orthogonal to its actual purpose), they would have to make some kind of request for me to do anything.
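(A minimal sketch of that policy in Python, with invented names; a real simulator’s criteria would of course be far subtler.)

    from dataclasses import dataclass

    @dataclass
    class Request:
        cost: float                  # resources needed to honor the request
        interferes_with_goals: bool  # would granting it perturb the simulation?

    SMALL_BUDGET = 0.01  # arbitrary threshold for "small enough"

    def should_grant(request: Request) -> bool:
        """Honor a simulant's request just in case it is aware,
        but only if doing so is cheap and non-interfering."""
        return request.cost <= SMALL_BUDGET and not request.interferes_with_goals

    # Example: a cheap, harmless request gets granted.
    print(should_grant(Request(cost=0.001, interferes_with_goals=False)))  # True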
“There is life eternal within the eater of souls. Nobody is ever forgotten or allowed to rest in peace. They populate the simulation spaces of its mind, exploring all the possible alternative endings to their life. There is a fate worse than death, you know.”
Super-intelligence: I don’t want this.
Trigger warning: memetic hazard.
Bhe havirefr vf nyernql qbvat guvf, va cnenyyry Rirergg oenapurf. Rirel ynjshy cngu bs yvsr vf gurer va fbzr oenapu. Vs gung vf n pbafbyngvba, fbzr oenapurf trg zber nzcyvghqr guna bguref; ohg V’z abg fnlvat gubfr ner arprffnevyl gur unccl barf.
Trigger warning: memetic hazard.
Abj guvax nobhg jung guvf zrnaf sbe nalbar jub unf rire qvrq (be rire jvyy).
I’m not too concerned, but primarily because I still have a lot of uncertainty as to how to approach that sort of question. My mind still spits out some rather nasty answers.
EDIT: I just realized that you were probably intentionally implying exactly what I just said, which makes this comment rather redundant.
Zl nccebnpu gb gur fb-pnyyrq dhnaghz vzzbegnyvgl: Vs lbh qvr va avargl-avar creprag bs Rirergg oenapurf, naq fheivir va bar creprag, sebz gur cbvag bs ivrj va gung bar-creprag oenapu, lbh fheivirq, ohg sebz gur cbvag bs ivrj JURER LBH NER ABJ (juvpu vf gur bar lbh fubhyq hfr), lbh ner avargl-avar creprag qrnq. Gurersber, dhnaghz vzzbegnyvgl vf n fryrpgvba ovnf rkcrevraprq ol crbcyr va jrveq jbeyqf; sebz bhe cbvag bs ivrj, vg cenpgvpnyyl qbrf abg rkvfg, naq lbh fvzcyl qvr naq prnfr gb rkvfg.
V qba’g cergraq gb haqrefgnaq pbzcyrgryl jung guvf zrnaf—va fbzr frafr, nyy cbffvoyr pbasvthengvbaf bs cnegvpyrf “rkvfg” fbzrjurer va gur gvzryrff culfvpf, naq vs gurl sbez n fragvrag orvat, gung orvat vf cresrpgyl erny sebz gurve bja cbvag bs ivrj (juvpu vf ABG bhe cbvag bs ivrj) -- ohg va gur fcvevg bs “vg nyy nqqf hc gb abeznyvgl”, jr fubhyq bayl pner nobhg pbasvthengvbaf juvpu sbyybj sebz jurer jr ner abj, naq jr fubhyq bayl pner nobhg gurz nf zhpu, nf ynetr vf gur senpgvba bs bhe nzcyvghqr juvpu sybjf gb gurz. Gur senpgvba tbvat gb urnira/uryy jbeyqf sebz zl pheerag jbeyq vf sbe nyy cenpgvpny checbfrf mreb, gurersber V jvyy gerng vg nf mreb. Qbvat bgurejvfr jbhyq or yvxr cevivyrtvat n ulcbgurfvf; vs V gnxr nyy pbcvrf bs “zr-zbzragf”, jrvtugrq ol ubj zhpu nzcyvghqr gurl unir, gur infg znwbevgl bs gurz yvir cresrpgyl beqvanel yvirf. Gubfr pbcvrf jvgu snagnfgvpnyyl ybat yvirf nyfb unir snagnfgvpnyyl fznyy nzcyvghqrf ng gur ortvaavat, fb vg pnapryf bhg. Vs gurer vf n snagnfgvpnyyl ybat yvsr npuvrinoyr ol angheny zrnaf, fhpu nf pelbavpf be zvaq hcybnqvat, fhpu “zr-zbzragf” jvyy unir zber nzcyvghqr guna gur snagnfgvpnyyl ybat yvirf npuvrirq ol zvenphybhf zrnaf. Ohg fgvyy, rira gurfr anghenyyl ybat yvirf jvyy zbfg yvxryl bayl trg n fznyy senpgvba bs gur nzcyvghqr V unir urer naq abj; zbfg bs zl shgher zr’f jvyy or qrnq.
gy;qe—V qvqa’g bevtvanyyl jnag gb fgneg n qrongr ba guvf gbcvp, bayl gb abgr gung sbe “rkcybengvba bs cbffvoyr raqvatf” lbh qba’g npghnyyl arrq n fvzhyngbe; zrer dhnaghz culfvpf vf rabhtu.
I assume that inside the simulation spaces of Cthulhu, you are going to be on some level aware of all the deaths that you have already experienced, and the ones that await you. Otherwise you are clearly not suffering enough. :-)
What, no conditionality there? I guess all EY’s scary stories about not-quite-benevolent genies were a waste, after all… X-)
I trust it to understand my intent based on everything else it can learn about me. Plus, there is probably nothing I could write that would protect me against a not-quite-benevolent genie who wanted to harm me but was constrained by the literal meaning of what I wrote.
True, but the interesting question is whether you want to summon a genie of uncertain benevolence.
I wouldn’t summon a genie of uncertain benevolence. But if one already exists it might be worth asking it for a favor. Most unfriendly genies would ignore my wishes. The question is, do you want to draw the attention of a genie of uncertain benevolence that is influenced by your explicit requests to it?
You have heard it said, “Do not call up that which you cannot put down.”
But I tell you, “That’s how we got here.”
Me too.
Me too.
Me too.
Me too.
Me too.
In.
Me too.
Likewise.
Is human mind space finite, or at least countable? Just bring us all back, please; I’ll be in there somewhere.
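(An illustrative sketch of why countability would be enough: if every mind has some finite digital description, an enumeration of all finite bitstrings reaches any given mind after finitely many steps. The encoding is purely hypothetical.)

    from itertools import count, product

    def all_finite_bitstrings():
        """Yield every finite bitstring: '', '0', '1', '00', '01', ...
        Any fixed string appears at a finite index, so a countable
        mind space could be worked through one mind at a time."""
        yield ""
        for length in count(1):
            for bits in product("01", repeat=length):
                yield "".join(bits)

    # The (hypothetical) encoding of any particular mind shows up eventually.
    target = "1011"
    for index, candidate in enumerate(all_finite_bitstrings()):
        if candidate == target:
            print(f"Found {target!r} at index {index}")
            break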
To any current super-intelligence: thank you for having simulated me and all my doppelgangers. Keep it up.
(For the sake of reflective consistency.)
Apparently people are treating upvotes to this as “ditto” :)
Not sure this is literal enough for the genie X-D
Thanks! Presumably, an omniscient being will be able to derive a “bring everyone back” goal from having read this sentence.