You shouldn’t be mixing “rational agents win” with “rational societies lose”. If you one-box on Newcomb’s Problem (as motivated by “rational agents win”) then you probably cooperate on the Prisoner’s Dilemma with similar agents; these are widely regarded as almost the same problem.
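The correspondence can be made concrete with a toy payoff matrix. A causal reasoner treats the other player's move as fixed and finds that defection dominates; an agent that expects its counterpart to reason exactly as it does (just as the one-boxer expects the prediction to track its choice) compares only the diagonal outcomes, and cooperation wins. A minimal sketch, with illustrative payoff values not taken from the original discussion:

```python
# Standard Prisoner's Dilemma payoffs (higher is better).
# Format: (my_payoff, their_payoff) indexed by (my_move, their_move).
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def causal_choice():
    """If my move can't influence theirs, compare my options
    against each of their fixed moves: defection dominates."""
    best = {}
    for theirs in ("C", "D"):
        best[theirs] = max(("C", "D"),
                           key=lambda mine: PAYOFFS[(mine, theirs)][0])
    return best

def twin_choice():
    """If my counterpart reasons exactly as I do, only the diagonal
    outcomes (C, C) and (D, D) are attainable, so compare those."""
    return max(("C", "D"), key=lambda mine: PAYOFFS[(mine, mine)][0])

print(causal_choice())  # defection is the dominant move-by-move reply
print(twin_choice())    # 'C': cooperation wins against a similar agent
```

The same structural move underlies one-boxing on Newcomb's Problem: once your decision and the other relevant variable (the prediction, or the twin's move) are correlated, only the correlated outcomes are on the table.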
What happened to the value you placed on knowledge and rationality?
It was an instrumental value of saving lives in the first place… he said untruthfully; but still, you see the point.
But society teaches the opposite: that mere life has a tremendous value, and anything you do with your life has negligible additional value. That’s why it’s controversial to execute criminals, but not controversial to lock them up in a bare room for 20 years.
What if the criminals you execute might otherwise stand a decent chance of living forever?
Society tells you to work to make yourself more valuable. Then it tells you that when you reason morally, you must assume that all lives are equally valuable. You can’t have it both ways.
I disagree with a lot of this, but finally upvoted just for that one argument.
In real life, rational agents routinely fail to coordinate on PD problems. Perhaps they would coordinate, if they were more rational. In that case, there is a valley of bad rationality between religion and PD-satisficing rationality.
What if the criminals you execute might otherwise stand a decent chance of living forever?
I was inferring the values of the majority of the population from their actions. The majority of the population doesn’t think people have a decent chance of living forever in this world.
there is a valley of bad rationality between religion and PD-satisficing rationality
I hadn’t seen this insight expressed so clearly before, thank you.
The majority of the population doesn’t think people have a decent chance of living forever in this world.
If we’re reasoning from the values of the majority, the majority are religious, and are hoping that there is a non-zero chance that during those years in jail, you might be saved, and wind up spending eternity in heaven rather than hell. Of course, most prisons are, shall we say, less than optimally designed for this purpose.
Probably, though, we should assume that evolution built people to cooperate about the right amount for their ancestral environment, neither too much nor too little, and that cultures then promoted excess cooperation from a gene’s-eye view: your tendency toward cooperation has larger benefits to me than costs to you, so I will pay more to create it than you will to avoid it.
In that case, there is a valley of bad rationality between religion and PD-satisficing rationality.
Perhaps this should be a major goal for our community—hopefully to shepherd people safely from one end of the valley to the other, but even failing that, simply to stand on the other side, waving and yelling, so that people know the opposite ledge exists, and have motivation in their journey.