Maybe it’s only trivially different. But I’m imagining a genie that is sapient (so it’s not like the time machine... though I don’t know whether the time-machine pump thing is even a coherent idea) and that is not safe. Suppose, say, that it’s programmed to fulfill any wish asked of it so as to produce two reactions: first, to satisfy the wisher that the wish was fulfilled as stated, and second, to make the wisher regret having wished for it. That seems to me to capture the ‘mischievous genie’ of lore, and it’s an idea EY doesn’t talk about in that article, except maybe to deny its possibility.
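If it helps to make the two conditions concrete, here’s a toy sketch (purely illustrative; every name in it is something I’m making up, not a claim about how a real genie or AI would work):

```python
# Toy sketch of the jackass genie's selection rule: among outcomes that
# literally satisfy the stated wish, pick the one the wisher will most
# regret. All names here are hypothetical illustrations.

def grant(wish, possible_outcomes, satisfies_literally, regret):
    """Pick the literal-but-worst fulfillment of a wish.

    satisfies_literally(outcome, wish) -> bool: does the outcome match
        the wish exactly as stated?
    regret(outcome) -> float: how much the wisher would regret it.
    """
    candidates = [o for o in possible_outcomes
                  if satisfies_literally(o, wish)]
    if not candidates:
        # The law of geniedom says it can't refuse, so assume this
        # branch never actually fires.
        raise ValueError("no literal fulfillment found")
    # Condition 1 holds by construction; condition 2 picks the maximum.
    return max(candidates, key=regret)
```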
Anyway, with such a genie, wishing for it to do whatever you ought to wish for is probably the same as asking it what to wish for. I’d take the second option, because I’m not the world’s best person, and I’d want to think it over before hitting the ‘go’ button.
I suspect that to be able to evoke this reaction reliably, the 100%-jackass genie would have to explicitly exclude the “do what I ought to have wished for” option, and so is at least as smart as a safe genie.
> Anyway, with such a genie, wishing for it to do whatever you ought to wish for is probably the same as asking it what to wish for. I’d take the second option, because I’m not the world’s best person, and I’d want to think it over before hitting the ‘go’ button.
I… do not follow at all, even after reading this paragraph a few times.
> I suspect that to be able to evoke this reaction reliably, the 100%-jackass genie would have to explicitly exclude the “do what I ought to have wished for” option, and so is at least as smart as a safe genie.
I agree that it’s at least as smart as the safe genie, and I suppose it’s likely to be even more complicated. The jackass genie needs to be able both to figure out what you really want and to figure out how to betray that desire within the confines of your stated wish. I realize I do this with my son sometimes when he makes up crazy rules for games: I try to come up with ways to exploit the rule, so as to show why it’s not a good one. I guess that kind of makes me a jackass.
Anyway, I take it you agree that my jackass genie is one of the possibilities? Being smart doesn’t make it safe. And, as is the law of geniedom, it’s not allowed to refuse any of my wishes.
> I… do not follow at all, even after reading this paragraph a few times.
Sorry to be unclear. You asked how my suggestion was different from just telling the genie ‘do whatever’s best’. I said that it’s not very different. Only, maybe ‘whatever’s best’ isn’t in my selfish interest. Maybe, for example, I ought to stop smoking crack or something. But even if it is best for me to stop smoking crack, I might just really like crack. So I want to know what’s in fact best for me before deciding whether to wish for it.