Wireheading isn’t a siren world, though. The point of the concept is that a siren world looks like what we want when we view it from the outside, but on the inside something is very wrong. Example: a world full of people who are always smiling and singing about happiness because they will be taken out and shot if they don’t (Lily Weatherwax’s Genua comes to mind). If the “siren world” fails to look appealing to (most) human sensibilities in the first place, as wireheading does, then it’s simply failing at being a siren.
The point is that we’re supposed to worry about what happens when we can let computers do our fantasizing for us, in high resolution and real time, and then put those fantasies into action. The worry has a kernel of sense: there’s a danger in getting caught up in the nice aspects of a badly un-thought-through fantasy without considering what it would really be like. But the scenario talks as if we could ever actually do this.
The problem being: no, we can’t actually do that kind of “automated fantasizing” in any real sense, for the same reason that fantasies don’t resemble reality. Fully simulating a fantasy in high resolution (i.e., such that choosing to put it into action would involve any substantial causal entanglement between the fantasy and the subsequently realized “utopia”) requires degrees of computing power we just won’t have, and which it wouldn’t even be efficient to use that way.
Backwards chaining from “What if I had a Palantir?” does lead to thinking, “What if Sauron used it to overwhelm my will and enthrall me?”, which sounds wise, except that “What if I had a Palantir?” really ought to lead to, “That’s neither possible nor an efficient way to get what I want.”