It seems to me that this borders on saying that people who made a different choice from yours are not just wrong, but suffering from something: their brains are not working properly, and they need to be taught how to make “better” choices, where “better” obviously means more in line with the choice you would make.
It’s not about the choice in isolation; it’s the mismatch between stated goals and actions. If someone says they want to save money, and they spend tens of hours of their time avoiding a $5 expense when there was a $500 expense they could have avoided with the same effort, then they aren’t doing the best thing for their stated goal. Scope-insensitivity problems like this are very common, because quantifying and comparing things is a skill that not everyone has, and they cause a huge amount of wasted resources and effort. That doesn’t mean everything that looks like an example of scope insensitivity actually is one; people may have other, unstated goals. In the classic study with birds and oil ponds, for example, people might donate a little money just to make themselves look good to the experimenter.
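The mismatch described above can be made concrete with a quick back-of-the-envelope comparison. This is a minimal sketch with hypothetical numbers (the "tens of hours" figure is assumed to be 20 here), comparing dollars saved per hour of effort for the two choices:

```python
# Hypothetical numbers illustrating the stated-goal mismatch:
# the same effort spent on two different money-saving choices.
hours_spent = 20       # assumed value for "tens of hours"
small_saving = 5       # dollars saved by the choice actually made
large_saving = 500     # dollars that the same effort could have saved

rate_small = small_saving / hours_spent   # dollars saved per hour
rate_large = large_saving / hours_spent   # dollars saved per hour

# The neglected option is 100x more effective per hour of effort.
print(rate_small, rate_large, rate_large / rate_small)
```

If the goal really is "save money," the per-hour rate is the relevant quantity, and the two options differ by two orders of magnitude; that gap is what scope insensitivity hides.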
(I would also note that, while the classic birds-and-oil-ponds study is often used as an illustrative example, most people’s belief that scope insensitivity exists and is a problem does not rely on that example alone, and other examples are easy to find.)
So you agree with me that there may be a rational reason for them not to donate more money. Which implies that it is not logical or rational of Eliezer Yudkowsky to ascribe that behavior to a human brain error.
Thank you.