I’ll admit I can’t make much sense of what you’re saying, but
anything that makes you care more about a winning bet than a losing bet of the same amount.
this is already in the post—you care about a winning bet because it saved you from hard times, you don’t care about a losing bet because you profit in other ways from this outcome.
I didn’t disagree with the post, nor suggest the post was lacking. I pointed out that the concluding exhortation misses the mark.
It absolutely does. I sacrificed some precision for clarity so that I could end with a ringing exhortation. When I have a moment I’ll probably footnote this.
Honestly, part of me is still a little confused about what I’m supposed to do at the ends of essays other than stop talking when I’ve said all the stuff I have to say.
ETA: On further reflection, the exhortation is almost right. The target you want to optimize for is “outcome in which money is worth more,” but “outcome I’d really hate” is a cheaper target to compute: it’s emotionally salient and can be processed quickly, probably in parallel, while still being a decent pointer to the true target. You can then use a deliberative, serial process to pick the outcomes you actually should bet on.
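To make the “money is worth more” point concrete, here is a minimal numeric sketch. The log-utility function and the wealth figures are my illustrative assumptions, not anything from the post: the same $100 payout buys far more utility in the outcome you’d hate, because your wealth is lower there.

    import math

    def log_utility(wealth):
        # Toy model of diminishing marginal utility of money (illustrative only).
        return math.log(wealth)

    # Hypothetical numbers: wealth in the outcome I'd hate vs. the outcome
    # I'd like, and a $100 payout from a winning bet in each.
    wealth_bad, wealth_good, payout = 1_000, 100_000, 100

    gain_bad = log_utility(wealth_bad + payout) - log_utility(wealth_bad)
    gain_good = log_utility(wealth_good + payout) - log_utility(wealth_good)

    print(f"utility gain if the bet pays off in the hated outcome: {gain_bad:.4f}")
    print(f"utility gain if the bet pays off in the liked outcome: {gain_good:.4f}")
    # The same $100 is worth far more (in utility terms) in the outcome I'd hate,
    # which is why "outcome I'd really hate" is a decent proxy for
    # "outcome in which money is worth more."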
The target you want to optimize for is “outcome in which money is worth more,” but “outcome I’d really hate” is a cheaper target to compute: it’s emotionally salient and can be processed quickly, probably in parallel, while still being a decent pointer to the true target. You can then use a deliberative, serial process to pick the outcomes you actually should bet on.
Exactly!