I’m not Eliezer, and perhaps not being an AGI researcher means my answer is irrelevant, but I think that things can have a deep aesthetic value or meaning from which one can gain insights into things more important than AI or rationality. One of those things may be the ‘something to protect’ that Eliezer wrote about. Others may be intrinsic values to discover, values that give your rationality purpose. If I could keep only one, a copy of the Gospel of Buddha or a copy of MITECS, I would keep the Gospel of Buddha, because it reminds me of the importance of terminal values like compassion. When I read GEB, its ideas of interconnectedness, of patterns, and of meaning left me with a clearer thought process than reading Eliezer’s short paper on Coherent Extrapolated Volition did; that paper was enjoyable but just didn’t resonate in the same way. Calling these things ‘entertaining fluff’ may be losing sight of Eliezer’s 11th virtue: “The Art must have a purpose other than itself, or it collapses into infinite recursion.”

That is all, of course, my humble opinion. Maybe having everyone read about and understand the dangers of black swans and unfriendly AI would be more productive than having them read about and understand the values of compassion and altruism; for if people do not understand the former, there may be no world left for the latter.