But you yourself also wrote, a couple of years ago:
if aligned AGI gets here I will just tell it to reconfigure my brain not to feel bored, instead of trying to reconfigure the entire universe in an attempt to make monkey brain compatible with it. I sorta consider that preference a lucky fact about myself, which will allow me to experience significantly more positive and exotic emotions throughout the far future, if it goes well, than the people who insist they must only feel satisfied after literally eating hamburgers or reading jokes they haven’t read before.
And indeed, when talking specifically about the Fun Theory sequence itself, you said:
I think Eliezer just straight up tends not to acknowledge that people sometimes genuinely care about their internal experiences, independent of the outside world, terminally. Certainly, there are people who care about things that are not that, but Eliezer often writes as if people can’t care about the qualia—that they must value video games or science instead of the pleasure derived from video games or science.
Do you no longer endorse these statements?