Isn’t it obvious how similar your argument is to the very one in the link you provided, that “stupid people deserve to suffer!”?
No; my argument is analogous to the following, from the same link:
“Yes, sulfuric acid is a horrible painful death, and no, that mother of 5 children didn’t deserve it, but we’re going to keep the shops open anyway because we did this cost-benefit calculation.”
That is: “Yes, it’s unfortunate that some schizophrenics may be bilked unfairly, but we’re going to let people sign pet-rapture contracts anyway because we did this cost-benefit calculation.”
In any case, being schizophrenic is not some point on a continuous line from “expert rationalist” to “anti-reductionist”—it doesn’t respond to these kinds of incentives, so your upside isn’t even there.
Most rapture-believers aren’t schizophrenic. They’re mostly just people who arrive at beliefs via non-rational processes encouraged by their social structure. The upside seems pretty clear to me—in fact this is just a kind of prediction market.
No; my argument is analogous to the following, from the same link: …The upside seems pretty clear to me—in fact this is just a kind of prediction market.
You think people who bet poorly on prediction markets deserve to lose their money. You consider this to be a species of prediction market. Ergo, you believe that people who believe in the rapture deserve to lose their money.
That is: “Yes, it’s unfortunate that some schizophrenics may be bilked unfairly, but we’re going to let people sign pet-rapture contracts anyway because we did this cost-benefit calculation.”
Which? (Note: “I can spend money stolen from stupid people better than they can” isn’t so much a “cost-benefit calculation” as it is the “standard thief’s rationalization”.)
Most rapture-believers aren’t schizophrenic. They’re mostly just people who arrive at beliefs via non-rational processes encouraged by their social structure.
That’s a good point. However,
I see the potential customers as people who have “contracted a minor case of reason”, for lack of a better term. They seek enough logical closure over their beliefs to care about their pets after a rapture. That’s evidence that their reflective equilibrium does not lie in going whole hog with the rapture thing, and that this is not a mere case of weird preferences, but of preferences you can expect to change in a way that will make them regret this purchase. This puts it closer to the dark arts/akrasia-pump/fraud category.
You think people who bet poorly on prediction markets deserve to lose their money
Premise denied. As a consequentialist, I don’t believe in “desert”. I don’t think criminals “deserve” to go to prison; I just reluctantly endorse the policy of putting them there (relative to not doing anything) because I calculate that the resulting incentive structure leads to better consequences. Likewise with prediction markets: for all that the suffering of individuals who can’t tell truth from falsehood may break my heart, it’s better on net if society at large is encouraged to tell truth from falsehood.
I see the potential customers as people who have “contracted a minor case of reason”, for lack of a better term. They seek enough logical closure over their beliefs to care about their pets after a rapture. That’s evidence that their reflective equilibrium does not lie in going whole hog with the rapture thing,
I see it differently: as evidence that they actually believe their beliefs, in the sense of anticipating consequences, rather than merely professing and cheering.
and that this is not a mere case of weird preferences, but of preferences you can expect to change in a way that will make them regret this purchase
Yes, but only provided their beliefs become accurate. And being willing to cause them regret in order to incentivize accurate beliefs is the whole point.
Premise denied. As a consequentialist, I don’t believe in “desert”.
Right, you just believe in concepts functionally isomorphic to desert, purport to have achieved a kind of enlightened view by this superficial disavowal of desert, and then time-suck your opponents through a terminology dispute. Trap spotted, and avoided.
Seriously, have you ever considered being charitable? Ever? Such as asking what data might have generated a comment, not automatically under the assumption that your interlocutor (notice I didn’t say “opponent”) is interested only in signaling enlightenment?
This is Less Wrong. In case you forgot.
The point of the paragraph whose first two sentences you quoted was to answer your question about what the cost-benefit calculation is, i.e. what the purpose of pet-rapture-contracts/prediction markets is. The benefit is that accurate beliefs are encouraged. It isn’t (noting your edit) that those who start out with better beliefs get to have more money, which they can make better use of. (That could potentially be a side benefit, but it isn’t the principal justification.)