One possible reason this may have been downvoted is that Less Wrong-ers tend not to distinguish between “true beliefs” and “what those true beliefs tell us about the world”. Okay, I may be committing the mind projection fallacy, I don’t know. At least I think of them as kind of the same thing if those “true beliefs” are fundamental enough (which, it’s worth pointing out, kind of makes them “more true” if you have a reductionist viewpoint).
For example, knowledge of addition may tell you little in itself, but if you think about it, addition is an abstraction of a useful operation that holds for any kind of object, which implicitly claims (it seems to me) that some physical laws are universal. The same idea could lead you to the notion of logic (since it carries the idea that you can make universal statements about form, as opposed to content).
I hope that’s not the reason for the downvote, because that completely misses the point of my comment.
My point is basically that the advice Luke gives—while very good advice—is not advice that follows from the simple desire to believe true things.
Believing true things is great. Believing true, interesting, useful things is better. Believing true, interesting, useful things while not believing false, trivial, useless things is better still. The content of our beliefs matters and that fact should be up front in the goal of rational inquiry. Again, the goal of rational inquiry is not simply to have true beliefs.
If simply having as many true beliefs as possible were really the goal of rational inquiry, then the best strategy would be to believe everything (assuming that is possible). By believing everything, you believe all of the true things. Sure, you also believe a lot of false things, but Luke didn’t ask about what it would look like if people really wanted to avoid believing falsehoods.
I don’t know if I was especially unclear in the earlier comment or if I was too uncharitable in my reading of Luke’s post. Whatever. It is still the case that the goal of maximizing the number of true beliefs one holds is a bad goal. A slightly better goal is to maximize the number of true beliefs one has while minimizing the number of false beliefs one has. But such a rule leads, I think, to a strategy of acquiring safe but trivial beliefs. For example, the belief that 1293 cubed is equal to the sum of three consecutive prime numbers. Such a belief does not have nearly the same utility as the belief that bodies in rectilinear motion tend to remain in rectilinear motion unless impressed upon by a force, even if the latter belief is only approximately true.
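(As an aside, here is a minimal sketch, purely my own illustration and not anything from Luke’s post, of how cheaply a belief of that kind could be checked by brute force. The function names are made up for illustration, and I leave running it against 1293 cubed to the reader.)

```python
# Sketch only: brute-force check of whether a number is the sum of three
# consecutive primes. Helper names are invented for illustration.

def is_prime(n: int) -> bool:
    """Trial-division primality test; adequate for numbers around 10^9."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def next_prime(n: int) -> int:
    """Smallest prime strictly greater than n."""
    candidate = n + 1
    while not is_prime(candidate):
        candidate += 1
    return candidate

def three_consecutive_primes_summing_to(n: int):
    """Return (p, q, r) if n = p + q + r for consecutive primes p < q < r, else None."""
    # The middle prime of any such triple must sit near n / 3, and prime gaps in
    # this range are tiny compared with 1000, so start the window just below n // 3.
    p = next_prime(n // 3 - 1000)
    q = next_prime(p)
    r = next_prime(q)
    while p + q + r <= n:
        if p + q + r == n:
            return (p, q, r)
        p, q, r = q, r, next_prime(r)
    return None

if __name__ == "__main__":
    n = 1293 ** 3
    print(n, three_consecutive_primes_summing_to(n))
```

The point of the exercise is not the answer but how little the answer is worth: a fraction of a second of trial division buys you a true belief with essentially no reach.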
I am not picking on Luke’s advice. The advice is great. I am picking on his starting point. I don’t think the goal as stated leads to the advice. Something is missing from the goal as stated.
Well, maybe I was being too charitable. Maybe you didn’t actually read the post, in which he outlines what kind of knowledge is most likely to be useful.
Of course, you have a good point regarding true vs. useful beliefs. I picked up on that in your original comment and provided an angle for thinking about it that you also may not have read.
Anyway, as a rule, you should read a post through before commenting.