I have slightly refined my understanding of the work. First, I didn’t realize it was set in an existing fictional universe, an RPG called Pathfinder. Second, I think Eliezer wrote (tweeted?) that it’s meant to be a work on “hard-core decision theory”. So it’s clearly HPMOR 2.0. The result is as if Bourbaki decided to write a play with Samuel Beckett—it’s not something I would read for pleasure, but it’s a work I might try to understand, because I’m interested in the author’s oeuvre and intellectual trajectory, and because I might need to navigate it at some point.
Dath Ilan is apparently a kind of high-IQ libertopia. Again, it’s an interesting concept, but I have to say that a society based on economic calculation through and through seems to be one in which extremely selfish personalities set the tone for everything. Not everything called selfish is bad—I am thinking especially of transhumanism; humanity’s relative indifference to the prospect of radical life extension might be its most foolishly self-denying trait—but a culture in which exploitative acts are restrained only by rational self-interest is surely less than optimal. (But perhaps that’s not the entirety of how Dath Ilan works.)
I said someone should extract the valuable lessons. David Udell posted Dath Ilan’s safety principles for a tool AI, which was interesting.