Eliezer used some pretty strong normative language when talking about having false beliefs, e.g. in Dark Side Epistemology:
Steven Kaas said, “Promoting less than maximally accurate beliefs is an act of sabotage. Don’t do it to anyone unless you’d also slash their tires.” Giving someone a false belief to protect—convincing them that the belief itself must be defended from any thought that seems to threaten it—well, you shouldn’t do that to someone unless you’d also give them a frontal lobotomy.
The quote from Eliezer is consistent with #1, since it’s bad to undermine people’s ability to achieve their goals.
More generally, you might believe that it’s morally normative to promote true beliefs (e.g. because they lead to better outcomes) without believing that it’s epistemically normative, in a realist sense, to do so (e.g. the question I asked above, about whether you “should” have true beliefs even when there are no morally relevant consequences and having them doesn’t further your goals).