This post by the same author answers your comment: https://carado.moe/surprise-you-want.html
Freedom is just a heuristic; let’s call the actual thing we want for humans our values (which is what we hope Elua will return in this scenario). By definition, our values are everything we want, including possibly the abolition of anthropocentrism.
What is meant here by freedom and utopia is "the best scenario". It's not about what our values are; it's about a proposed method for reaching them.
I've read that post before. I dislike its narcissistic implications. Even if it's true, I think humans can only be harmed by thinking about it.
Why would it harm humans?
Do you think the expected value of thinking about it is negative because it might lead us to overlook some forms of alignment?