The more time that passes, the likelier it becomes that transhumanism and Singularity futurism will eventually find political expression. It’s also likely that the various forms of rationalistic utilitarian altruism existing in certain corners of the Internet will eventually give rise to a distinctive ideology that will take its place in the spectrum of political views that count. It is even possible that some intersection of these two currents—the futurological rationalism on display at this site—will give rise to a politically minded movement or organization. This post, the earlier “Altruist Support” sequence by Giles, and a few others show that there’s some desire to do this. However, as things stand, this desire is still too weak and formless for anyone to actually do anything, and if anyone did become worked up and fanatical enough to organize seriously, the result would most likely be an irrelevant farce, a psychodrama meaningful only to half a dozen people.
The current post combines three things: complete blindness to what’s involved in acquiring power at a national or international level; no sense of how embattled and precarious the situation of futurist causes like cryonics and Friendly AI is; and misplaced confidence in the correctness of the local belief system.
Let’s start with the political naivete. Rather than taking over openly, it’s proposed that the Conspiracy could settle for
a simple infiltration of the world’s extant political systems
I love the word “simple”. Look, politics isn’t a game of hide-and-seek. Ideological groups have the cohesion they do because membership in the group depends on openly espousing the ideology. Suppose you get to be head of the politburo of the Tragic Soulfulness League after years of dutifully endorsing the party line, and then, once you’re in charge, you announce to your colleagues that you actually believe in Maximum Happiness. What happens is that the next day, the media carry the tragically soulful news of the unfortunate accident which cut you down just at the beginning of your term in office, and your successor, the former deputy head, wiping away a tear, vows to uphold the principles of the tragic soul, just as you would have wanted.
the Conspiracy becomes the only major influence in world politics
A perfect picture of fanaticism… Apparently you think of political influence only in terms of belief systems. The perfect end state is that the one true belief system is triumphant! But political influence is also simply an expression of the existence of a group of people; it means that the system knows about them, listens to them, and contains their representatives. If the world still contains a billion Indians or three hundred million Americans, then India and America will continue to be major “influences” in world politics.
Now let’s turn to the author’s innocence regarding the situation of cryonics, etc., in the world.
we should devote fewer of our resources to cryonics and life extension, and focus on saving the lives of those to whom these technologies are currently beyond even a fevered dream
In other words, the microscopic number of highly embattled people currently working on these matters should instead take on the causes which are already ubiquitously signposted as Good, and which already receive billions of dollars per year. The rationale proposed for this perspective is that once the Conspiracy is in charge, it will own all the resources of the world, so it will be able to afford to do both things at once.
Arandur, if you follow this line of thought, you end up working neither on life extension nor on poverty alleviation, but simply on assuming power, with the plan of doing the promised good works at some unknown time in the future.
In passing, what specific proposals are offered here regarding the solution of recognized problems like war and starvation (as opposed to unrecognized problems like ageing or unfriendly AI)? The answers I see are (1) spend even more money on them, and (2) trust us to think of a better approach, because we’re rationalists and that means we’re better at problem-solving.
At least an explicitly transhumanist agenda would bring something concrete and new to politics. With respect to the existing concerns of politics, this proposal gives no one any reason to offer you a share of power or to support your aspirations.
Finally, there is the fanatical faith in the correctness of the local philosophy, and in the way it is destined to empower the true believer:
It is demonstrable that one’s level of strength as a rationalist has a direct correlation to the probability that the one will make correct decisions.
It is even more demonstrable that one’s level of self-identification as a rationalist has a direct correlation to the probability that one is irrelevant to anything of any significance, especially the sort of worldly affairs that you are talking about.
I’m being pulled off to bed, but from my skimming this looks like a very, very helpful critique. Thank you for posting it; I’ll peruse it as soon as I’m able. One note: I did note after posting this, but too late to make a meaningful change, that “we should support cryonics less” is rather a ridiculous notion, considering the people I’m talking to are probably not the same people who are working hardest on cryonics. So: oops.
The more time that passes, the likelier it becomes that transhumanism and Singularity futurism will eventually find political expression.
What does this mean, exactly? It’s something I seem to understand intuitively without thinking about it, but my understanding falls apart when I try to examine it closely. It’s like zooming in on a picture and finding that there are no further pixels in the data.
Originally I wrote “It is inevitable that” there will be a politics of the Singularity. But it’s possible (e.g., in an AI hard takeoff) that a singularity could happen before the political culture digests the concept. So there are two ways in which more time means more futurism in politics. First, the further into the human future we go, the more futurist the general sensibility becomes. Second, the longer the posthuman future holds off, the more time there is for this evolution of human culture to occur.
It’s interesting reading this old comment in light of Effective Altruism.