Claim: One way in which instrumental and epistemic rationality diverge is that knowing certain facts can kill your motivation system.
(For instance, knowing how complicated a problem is can stop you from wanting to try to solve it, even though solving part of it might give you the resources to solve the whole thing, and doing so might be in your interest.)
So you’re less likely to work on a problem if you think it has already received a lot of high-quality attention, or if you don’t think you have a comparative advantage?
Yes. But I’m not sure how that’s related.
How else does one know how complicated a problem is (if one hasn’t solved it)?
By comparing it to other, similar problems, estimating the number of factors involved, asking people who have worked on similar problems, or through many other methods.
This seems pretty strange to me, and I would strongly disagree with it.