I’m not sure what it means for society to be rewarded, or for society to benefit: I don’t think of “society” as a reward-seeking agent.
Societies persist, change, sometimes disappear. There is a class of beliefs and plans which have had large effects on societies—influencing their persistence, enacting large changes—and that is the class of scientific and technological beliefs.
Perhaps it would be more useful to say that scientific and technological beliefs have large effects on how societies fare, but smaller effects on how individuals fare. I’m not sure how true that is, but it sounds more testable.
I’m not sure what it means for society to be rewarded, or for society to benefit: I don’t think of “society” as a reward-seeking agent.
I think the point is that the creators of LW/OB implicitly take that stance: that there is such a thing as a “better” outcome for the whole human race or the whole country, and that we ought to have better institutions to achieve those outcomes. And if you do take that stance, you end up with planning-esque rationality, because “scientific and technological beliefs have large effects on how societies fare, but smaller effects on how individuals fare.”
Thanks for this comment, Morendil, your rephrase makes the point very clearly: “scientific and technological beliefs have large effects on how societies fare, but smaller effects on how individuals fare.”