I’ve been reading Eliezer’s recent stories with protagonists from dath ilan (his fictional utopia). Partly because of the style, I found myself bouncing off a lot of the interesting claims he made (although the stories still gave me a feel for his overall worldview). The part I found most useful was this page about the history of dath ilan, which can be read without much background context. I’m referring mostly to the exposition in the first 2⁄3 of the page, although the rest of the story from there is also interesting. One key quote from the remainder of the story:
“The next most critical fact about Earth is that from a dath ilani perspective their civilization is made entirely out of coordination failure. Coordination that fails on every scale recursively, where uncoordinated individuals assemble into groups that don’t express their preferences, and then those groups also fail to coordinate with each other, forming governments that offend all of their component factions, which governments then close off their borders from other governments. The entirety of Earth is one gigantic failure fractal. It’s so far below the multi-agent-optimal-boundary, only their professional economists have a five-syllable phrase for describing what a ‘Pareto frontier’ is, since they’ve never seen one in real life. Individuals sort of act in locally optimal equilibrium with their local incentives, but all of the local incentives are weird and insane, meaning that the local best strategy is also insane from any larger perspective. I cannot overemphasize how much you cannot predict Earth by reasoning that most features will have already been optimized into a not-much-further-improvable equilibrium. The closest thing you can do to optimality-based analysis is to think in terms of individually incentive-following responses to incredibly weird local situations. And the weird local situations cannot themselves be derived from first principles, because they are the bizarrely harmful equilibria of other weird incentives in other parts of the system. Or at least I can’t derive the weird situations from first principles, after two years of exposure and getting over the shock and trying to adapt. I would’ve been much better off if I’d tried to understand it as an alien society instead of a human one, in retrospect; and I expect the same would hold for an Earthling trying to understand dath ilan.”
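The game-theory jargon in that quote is standard, and its core claim is easy to make concrete. Below is a minimal sketch (my own illustration, not anything from the story) of the simplest cell of such a “failure fractal”: a one-shot Prisoner’s Dilemma, where the unique equilibrium of individually incentive-following play is precisely the one outcome off the Pareto frontier.

```python
from itertools import product

# Payoff table for a one-shot Prisoner's Dilemma: each entry maps
# (row_move, col_move) to (row_payoff, col_payoff).
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def is_nash(row, col):
    # Nash equilibrium: neither player gains by unilaterally deviating.
    row_ok = PAYOFFS[(row, col)][0] >= max(PAYOFFS[(r, col)][0] for r in "CD")
    col_ok = PAYOFFS[(row, col)][1] >= max(PAYOFFS[(row, c)][1] for c in "CD")
    return row_ok and col_ok

def is_pareto_optimal(row, col):
    # Pareto-optimal: no other outcome is at least as good for both
    # players (and different), i.e. nothing weakly dominates it.
    u = PAYOFFS[(row, col)]
    return not any(
        v != u and v[0] >= u[0] and v[1] >= u[1]
        for v in PAYOFFS.values()
    )

for row, col in product("CD", repeat=2):
    print((row, col), "Nash:", is_nash(row, col),
          "Pareto-optimal:", is_pareto_optimal(row, col))
# (D, D) is the unique Nash equilibrium, and it is the only outcome
# *off* the Pareto frontier: both players would prefer (C, C).
```

The quote’s picture is roughly this structure nested at every scale: each agent’s move is a best response to its local incentives, yet the resulting equilibrium is one that every participant would trade away if they could coordinate.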
My main update is that Eliezer has a very deep-rooted belief that the world is Lawful: that it makes sense to talk about real-world intelligence, coordination, ethics, etc., as (very imperfect) approximations of their idealised, mathematically definable forms. (Note, though, that these are conclusions I’ve extrapolated from his fiction, which is a fairly unreliable method of inferring people’s beliefs.)
Lots of other things he’s written support that update: for example, the claim that your model of the world will be accurate if and only if it somehow approximates Bayes’ theorem.
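To gesture at what that claim means in practice, here’s a toy Bayes update (my own numbers, not an example of his): whatever procedure a mind uses, if its beliefs track the truth, they have to end up approximating this arithmetic.

```python
def posterior(prior, true_positive_rate, false_positive_rate):
    """P(hypothesis | positive evidence) via Bayes' theorem."""
    evidence = (true_positive_rate * prior
                + false_positive_rate * (1 - prior))
    return true_positive_rate * prior / evidence

# A 90%-sensitive test with a 5% false-positive rate, applied to a
# condition with a 1% base rate:
print(posterior(prior=0.01, true_positive_rate=0.9, false_positive_rate=0.05))
# ≈ 0.154 — a positive result leaves the hypothesis still probably false,
# the kind of base-rate correction any accurate world-model must make.
```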
The dath ilan-based fiction definitely helped me internalize the idea, though.