in-between goals like nuclear power that are harder to change than the former category, but easier to change than the latter
Yep, this is kinda one of the things LW specializes in—helping people become better at changing their minds regarding things they are stubbornly wrong about.
I agree that human beings’ goals don’t neatly divide into instrumental and terminal; this is just a model we use. I think Lumifer is using the model in a way that’s harmful: labeling stubborn incorrect beliefs as “terminal goals” amounts to throwing up your hands and saying it’s impossible to help people become better at changing their minds. Based on what I’ve seen, this isn’t the case. Although it’s difficult, it is possible to help people become better at changing their minds, and accomplishing this is highly valuable.