I don’t think this is right, or at least it doesn’t hit the crux.
In a utopian society, people on a vegan diet would be the ones most interested in the truth about its nutritional challenges, since they are the ones who face the consequences. The fact that they often aren't reflects that they are not optimizing for living their own lives well, but for convincing others of veganism.
Marketing like this is the simplest (and thus most common?) way for ideologies to keep themselves alive. However, it’s not clear that it’s the only option. If an ideology is excellent at truthseeking, then this would presumably by itself be a reason to adopt it, as it would have a lot of potential to make you stronger.
Rationalism is, in theory, supposed to be this. In practice, rationalism kind of sucks at it, I think because it's hard, because people aren't funding it much, and maybe also because all the best rationalists start working in AI safety or something.
There are some complications to this story, though. As you say, there is no such thing as an epistemic environment that has not (in a metaphorical sense) declared war on you. Everyone does marketing, so everyone perceives full truthseeking as a threat, and you'd make a lot of enemies by doing this. A compromise would be a conspiracy that does its truthseeking in private to avoid punishment, but such a conspiracy is hardly an ideology, and it also feels pretty suspicious to organize at scale.