Do you insist on keeping simple pleasures because you fear you might lose them if cooler ones were available, i.e., that raising your happiness set level would eventually lower your total fun?
No, my opinion is that a cooler existence would make them meaningless. It’s not a question of fun or happiness: in a eutopia those are cheap commodities. It is more a question of identity and futility.
So, getting at what I mean: might that be a factor? Do you have a specific vision of what the kind of complex future you are opposing might look like?
I feel it’s important to say that I’m not opposing a future like that. I like AIs and robots and think that we need more of them in our lives. What I’m opposing is increasing the complexity of existence per se; for some people, that is unnecessary and unwanted. I simply don’t terminally value complexity, so an existence built on that is simply an existence I don’t prefer.
In other words, wouldn’t you always want to have as much fun as possible? (With “fun” in the fun theory sense that includes all your terminal values, not just pleasure.) It seems to me like this should be true for any properly functioning agent, although agents might disagree on what fun is.
That, in essence, is the central ‘dogma’ of your theory of fun. I’m telling you, however, that for some people (me, for example), that is just not true. I just don’t want to have more and more fun; it strikes me as meaningless and ‘childish’ (that is not an exact description; I would need to dig deeper into the precise feeling).
I would like to add to your theory of fun that there are agents who, once a certain level of fun/happiness is reached, need no more and can continue happily forever in that state of mind.
Thanks, now I understand you much better.
I can understand “maxing out” fun. I even suspect that my ability to experience fun is bounded, and that even without post-singularity tech I might maximize it. I wonder: what happens then? Once all your values are fulfilled (and sustainability is not an issue), what do you do?
(Obviously, self-modify to not get bored and enjoy the ride, says the wirehead. I’m not so sure about that anymore.)