“Perhaps it is the fear of being too late that is causing you distress. Perhaps you fear that humanity is going to be destroyed because you didn’t build an FAI soon enough. Perhaps you fear that your life will end some 10,000 years sooner than you’d like.”
Humanity’s alleged demise is not the only possible way he could be too late. I wonder where Eliezer would turn his attention if someone (or some group) solved the problems of FAI before him.
Eliezer has written a number of times that comparing your intelligence and rationality to those around you is pointless (e.g., it’s never good enough to be good merely in comparison). This philosophy has thus far been directed at comparing oneself to lower levels of cognition, but I don’t see why it shouldn’t work bottom-up as well. Learn from the levels above you, but do not lionize them. As we all aspire to embody the higher levels, I’m sure Jaynes must have also (an old vampire, but not old enough).
Eliezer: I don’t think we should worry about our particular positions on the bell curve, or set goals for where we want to be. Don’t fret over the possible limitations of your brain; doing so will not change them. Just work hard and try your best; always attempt to advance, to push the limitations. Jaynes was struggling against his meat-brain too. It’s human. You both surpassed the village idiots and the college professors, and the difference in levels becomes more and more negligible with each step approaching the limit. Everybody is working with meat designed by the “idiot god”. Push it to the limit, hate the limit, but don’t be self-conscious about it.
We all wish we had gotten an earlier start on things. The importance of them is perhaps something you have to learn as you grow.
It must have been intentional that all the Dystopia examples are almost one-to-one mappings of the real world, all except the cognitive one. That one stands out as strange, perhaps intentionally: the message is that the world is fucked, and we’ve only one more chance as the last Dystopian calamity looms before us.
As to the assignment:
Economic Weirdtopia: The production economy is entirely automated. Supply is nearly infinite thanks to the combination of this automation with asteroid mining. (The weird part is that the political will was somehow mustered to accomplish this.) Quite oddly, class inequalities are no longer sustainable, owing to the occasional public slaughter of the rising bourgeoisie and power elites.
Sexual Weirdtopia: What you described as Utopia seems pretty damn weirdtopia to me.
Governmental Weirdtopia: Each person is a congressman. “Leaders” are chosen by lot, or else elected on merit by representatives chosen by lot. Laws are written and interpreted by juries, who are themselves potentially open to prosecution for the verdicts they render. Lawmakers can be charged criminally by the people for the laws they pass.
Technological Weirdtopia: The human race has turned into a civilization of AI flying around the solar system (in a Dyson sphere).
Cognitive Weirdtopia: