Is there another strategy you prefer? Afaict the options are
1) Have public taboo beliefs.
2) Have private beliefs that you lie about.
3) Remain deliberately agnostic about topics that are taboo but not important enough to be worth the cost.
4) Get forever lucky, such that every taboo topic you investigate results in you honestly arriving at an allowed belief.
Whether 1) is at all compatible with having other career goals is a fact of the territory, and I expect in the US in 2024, there are topics where having taboo beliefs could totally end your career, for many values of career. (Leaving open whether there are such beliefs that are true, but, per the topic of this post, that’s not something you can learn without taking risks.)
2) seems even more prone to the effect you describe than 3).
My guess is you’re making a bid for 1), but I feel like a case for that should take into account the costs of believing X weighed against the costs of agnosticism about X, rather than a sweeping heuristic argument. (Where maybe the cost of agnosticism about X includes adjacent topics Y you’ll either have to include in your agnosticism or otherwise eat the cost of ~X connotations, though I’m skeptical about how often this will come up, and per my comment here I expect the ~Y->~X taboos will often be much smaller than the ~X taboo.)