It seems irrational to give up on dieting merely because the process can be complicated.
Expected value calculation > your ‘seeming’.
That’s not a good reason to give up on FAI, so why should it be for dieting?
Because creating an FAI has (strictly!) greater expected benefits than one person successfully losing weight? So much greater that the rhetorical question is ridiculous.
it still promises the most bang-for-the-buck in increasing your productivity, your self-image and your quality of life.
I’d wager you’ve never been overweight. Strap a few dozen lbs around your waist, or don them as a vest, and see what it does to your daily routine. We did that once in some class or other, and I’ve bordered on a BMI of 30 myself from time to time. You’re affected constantly; we are as yet embodied agents, not free-floating minds. What’s the likelier explanation for a lack of action: expected value calculations, or (here it comes) ‘akrasia’?
Because creating an FAI has (strictly!) greater expected benefits than one person successfully losing weight?
Depends on whose point of view. E.g. passing away in the knowledge that you’ve contributed to the eventual creation of FAI (which gives you fuzzies, or at least utilons) can be outweighed by living decades with more mental energy (which also contributes to your development efforts) and a better self-image.
That aside, the fact that a task is a complicated problem or puzzle can be an incentive to solve it in and of itself, especially for certain kinds of people.
This seems false.
Assuming the increase in productivity, self-image and quality of life (consider metabolic syndrome: preventing decades of injecting insulin can have quite an impact on your QALYs) to be fixed/constant for an individual, “true” or “false” does depend on how easy or hard it would be for that individual to efficiently attain and keep a lower BMI. For metabolically privileged people, or just those with an easy-to-fix problem such as hypothyroidism, the statement is probably true. For someone who, for whatever reason, cannot lose any weight no matter what he tries (within his motivational reach given his current energy levels … there’s a catch-22 present), it would be false.
Eliezer has mentioned many of the things he has tried in order to lose weight (including ketogenic diets and even clenbuterol). I’ve tried all those he has mentioned. The difference is that for me they work (I call it ‘cutting’ and can merrily play around with my body composition in all sorts of ways). But if, like Eliezer, I had expended huge amounts of effort and my body did not respond significantly, then I would update my expectations.

Things that are expected to fail have low expected value. Sometimes you need to shut up and multiply instead of shut up and do the impossible.
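To make the “shut up and multiply” point concrete, here is a toy calculation; the probability, utility and cost figures are purely illustrative assumptions, not anyone’s actual estimates:

$$
% made-up numbers, for illustration only
\mathrm{EV}(\text{attempt}) = p(\text{success}) \cdot U(\text{success}) - C(\text{attempt}) = 0.05 \times 100 - 20 = -15
$$

Even though success would be worth a lot, a low enough success probability combined with a nonzero cost of trying can push the expected value of the attempt below zero, which is exactly the situation in which giving up is the rational choice rather than akrasia.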
What’s the likelier explanation for a lack of action: expected value calculations, or (here it comes) ‘akrasia’?
Expected value calculations. Unless you are making accusations of lies, that is, outright fabrication of self-reports.
Depends on whose point of view. E.g. passing away in the knowledge that you’ve contributed to the eventual creation of FAI (which gives you fuzzies, or at least utilons) can be outweighed by living decades with more mental energy (which also contributes to your development efforts) and a better self-image.
For the purpose of declaring an accusation of irrationality false, the relevant point of view is Eliezer’s. If Eliezer had someone else’s values, then it would make sense to evaluate the rationality of a given choice for him according to those other values.
For metabolically privileged people, or just those with an easy-to-fix problem such as hypothyroidism, the statement is probably true.
Yes (or at least it would be up there on the list). It just isn’t true in this case.
Depends on whose point of view. E.g. passing away in the knowledge that you’ve contributed to the eventual creation of FAI (which gives you fuzzies, or at least utilons) can be outweighed by living decades with more mental energy (which also contributes to your development efforts) and a better self-image.
Pretty sure most people involved in FAI efforts are factoring in more than warm fuzzies in their EU calculations.