A lot of people care about the culture wars because they don’t believe the singularity is coming soon. Yet a lot of people who do believe it is coming soon still seem just as invested (e.g. Elon Musk, Peter Thiel, and others on the right).
Why?
Because the results of culture wars now will determine the post-singularity culture.
Can you give an example of a present-day result that would determine the post-singularity culture in a really good or bad way?
PS: I edited my question post to include “question 2”; what do you think about it?
I agree that the culture wars as fought now will influence what the great masses of people will believe on the day before AGI is created. Is it a relevant input to what they will believe 50 years after that, though?
Is there an implicit assumption of some convergence of singularities? Or that the near term doesn’t matter because the vastly bigger long term can’t be predicted?
Rather, an implicit assumption that normative culture tends to propagate top-down rather than bottom-up. Thus, influencing mass culture now seems like a losing strategy relative to influencing the culture of those who will in the future control AGI (if we manage to have controllable AGI).
Elon Musk, Peter Thiel, and the like (the people the OP mentions) are shaping up to be the ones controlling the singularity, if anyone does.
I think this is wrong for a lot of reasons.
Do you believe that is a good outcome? → If not, do you believe an ASI aligned to you would? → If not, are you expecting an ASI aligned to ‘all of present human culture’? → If not: If it were aligned to Elon Musk in particular, would they really endorse {the lightcone-wide enforcement of a particular culture} post-augmentation? → If so, they in any case do not need to promote their preferred views in the near term for a first decisive AI aligned to themself to be able to cause that.[1]
In most futures I expect, there won’t be a singular “the post-singularity culture”; either eventually something better happens (note the reachable universe is vast enough to harmlessly contain many cultures/spiritualities) or an unaligned decisive AI takes over.
Why do you expect a singular post-singularity culture, unless you are expecting the first decisive AI to be aligned to some evil person who wants that?
I guess that Musk either (1) feigns concern over current politicized topics to gain political power/allies, or (2) is actually concerned (having perhaps self-deceived into thinking they care until they actually became an entity which does), and is in any case acting irrationally according to their own values.
[1] Some of these conditionals (e.g. the first two) are conjunctive, but most are disjunctive.