If you believe you are obligated to help others, you will instinctively come up with justifications for why helping others means doing what you wanted to do anyway, instead of selling off all your earthly possessions to feed the starving.
Of course, you can mitigate this by, y’know, actually trying.
You can only maximize one variable, and if you’re maximizing altruism, you’re not maximizing truth.
This is only necessarily the case if you’re on the Pareto frontier, which no human is. There are reasons to think that there are sometimes better ways to optimize X than trying to optimize X. (I agree that an altruist and a truthseeker would and (by their preferences) should do different things, but it’s not as simple as you make it sound.)