There’s a far worse problem with the concept of a ‘utility function’ as a static entity than that different generations have different preferences: the same person has very different preferences depending on his environment and neurochemistry. A heroin addict really does prefer heroin to a normal life (at least during his addiction). An ex-junkie friend of mine wistfully recalls how amazing heroin felt, and how he realized he was failing out of school and slowly wasting away to death, but none of that mattered as long as there was still junk. Now, it’s not hard to imagine how, in a few iterations of ‘maximizing changing utilities,’ we all end up wireheaded one way or another. I see no easy solution to this problem. If we say “the utility function is that of unaltered, non-digital humans living today,” then there will be no room for growth and change after the singularity. But I don’t see an easy way of not falling into the local maximum of wireheading at some point… Solutions welcome.
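(As a toy illustration of the ‘iterated utility modification’ worry, here’s a minimal sketch under entirely made-up assumptions: preferences take small random steps as people tinker with their own neurochemistry, and any step that makes wireheading look better than ordinary life ratchets preferences further in that direction. Every number below — payoffs, noise, ratchet size — is a hypothetical stand-in, not a claim about real neurochemistry.)

```python
import random

def simulate(seed, steps=200, w_wire=0.1, noise=0.05, ratchet=0.2):
    """Toy slippery-slope dynamics. w_wire is how strongly the agent's
    *current* utility function favors wireheading over ordinary life.
    Each step, casual self-modification adds a little random drift; if
    the current utility ever ranks wireheading above ordinary life, the
    agent indulges, and the experience ratchets w_wire upward."""
    rng = random.Random(seed)
    for t in range(steps):
        # small, undirected preference tinkering
        w_wire = min(max(w_wire + rng.uniform(-noise, noise), 0.0), 1.0)
        u_wire = w_wire * 1.0          # effortless, maximal payoff
        u_life = (1.0 - w_wire) * 0.6  # effortful, imperfect payoff
        if u_wire > u_life:
            # the hit itself reshapes preferences (the heroin effect)
            w_wire = min(1.0, w_wire + ratchet)
        if w_wire >= 0.99:
            return t                   # effectively an absorbing state
    return None  # still living an ordinary life after `steps` periods

locked_in = [simulate(seed) for seed in range(1000)]
rate = sum(t is not None for t in locked_in) / len(locked_in)
print(f"fraction wireheaded within 200 periods: {rate:.2f}")
```

The point isn’t the particular numbers: ordinary life is a state the walk can drift out of, while wireheading, once preferred, reinforces itself, so given enough rounds of preference-editing the walk tends to end up there.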
What’s wrong with wireheading? Seriously. Heroin is harmful for numerous health and societal reasons, but if we solve those problems with wireheading, I don’t see the problem with large portions of humanity choosing ultimate pleasure forever.
We could also devise workarounds: for instance, timed wireheading, where you wirehead for a year and then set your brain to disable wireheading for the next year, or a more sophisticated, Fun-Theory-based version of wireheading that allows for slightly more complex pleasures.
There’s a difference between people choosing wireheading and a clever AI making that choice for them.
Why did your ex-junkie friend quit? That may suggest a possible answer to your dilemma.
A combination of being broke, almost dying, his mother’s interference, naltrexone, and being institutionalized. I think there are many who do not quit, though.
There are people who die from their drug habits, but there are also many recovered former addicts. There are also people who sustain a drug habit without the rest of their life collapsing completely, even a heroin habit. It is clearly possible for people to make choices other than taking another hit.
This is obviously true, but I’m not suggesting that all people will become heroin junkies. I’m using heroin addiction as an example of how neurochemistry changes directly change preferences, and therefore the utility function — i.e., the ‘utility function’ is not a static entity. Neurochemical differences among people are vast, and heroin doesn’t come close to a true ‘wirehead,’ yet some percentage of normal people are susceptible to having it alter their preferences to the point of death. After uploading/AI, interventions far more invasive and complete than heroin will be possible, and perhaps widely available. It is nice to think that humans will opt not to use them, and most people with their current preferences intact might not even try (just as many have never tried heroin), but if preferences are constantly being changed (as we will be able to do), then it seems likely that people will eventually slide down a slippery slope towards wireheading, since, well, it’s easy.
I find the prospect of an AI changing people’s preferences to make them easier to satisfy rather disturbing. I’m not really worried about people changing their own preferences or succumbing en masse to wireheading. It seems to me that if people could alter their own preferences, they would be much more inclined to move their preferences further away from a tendency towards wireheading. I see a lot more books on how to resist short-term temptations (diet books, books on personal finance, etc.) than I do on how to make yourself satisfied with being fat or poor, which suggests that people generally prefer preference changes that work in their longer-term rather than short-term interests.