To give you something to argue against, consider the position that “saving the world” spreads because it acts as a superstimulus to do-gooders. There’s no credible evidence that aiming at saving the world has any effect on the probability of the world ending. By contrast, “the end is nigh” placard syndrome is well known, and it diverts resources from other potentially useful tasks.
X-risk reduction didn’t really act as a superstimulus to me (I had to convince myself). To accept that x-risk reduction is a massive opportunity, I also needed to accept both that x-risk was a massive problem and that I was going to hold a non-mainstream worldview for the foreseeable future. So, there was more bad stuff to think about on this issue than good stuff—it was more ugh field than superstimulus.
That’s just me though; n=1.
Superstimuli do not have to be positive. Traditional religions spread by invoking eternal damnation; end-of-days groups spread their message by invoking eternal oblivion.
As for holding non-mainstream views, that too is a typical cult phenomenon. Weird beliefs act as markers of group membership: they show which tribe you belong to, so the ingroup can identify you. Typically, the crazier and weirder the beliefs, the harder the signal is to fake convincingly.
Without meaning to doubt your powers of introspection, people don’t necessarily have to be aware of being influenced by superstimuli. Sometimes, if the stimulus becomes conscious, the effect is reduced. So, for example, lipstick can be overdone, and often works best at a subliminal level. In the case of end-of-days groups, the superstimulus is pretty obvious, but its effect on any particular individual may not be.
Anyway, you can look to the left and see large positive utility, to the right and see large negative utility—but then you have to draw your own conclusions about why you are seeing those things.