> It’s essential to my ability to enjoy life

Doesn’t this assume that we’ll never have the technology to change our brain’s wiring to our liking? If we live in a post-scarcity utopia, why wouldn’t you be able to just change who you are as a person so that you fully enjoy the new world?
But you yourself also wrote a couple of years ago:
> if aligned AGI gets here I will just tell it to reconfigure my brain not to feel bored, instead of trying to reconfigure the entire universe in an attempt to make monkey brain compatible with it. I sorta consider that preference a lucky fact about myself, which will allow me to experience significantly more positive and exotic emotions throughout the far future, if it goes well, than the people who insist they must only feel satisfied after literally eating hamburgers or reading jokes they haven’t read before.
And indeed, when talking specifically about the Fun Theory sequence itself (https://www.lesswrong.com/posts/K4aGvLnHvYgX9pZHS/the-fun-theory-sequence), you said:
> I think Eliezer just straight up tends not to acknowledge that people sometimes genuinely care about their internal experiences, independent of the outside world, terminally. Certainly, there are people who care about things that are not that, but Eliezer often writes as if people can’t care about the qualia—that they must value video games or science instead of the pleasure derived from video games or science.
Do you no longer endorse this?