The only people who would consent to the dust speck are people who would choose SPECKS over TORTURE in the first place. Are you really saying that you “do not value the comfort of” Eliezer, Robin, and others?
However, your argument raises another interesting point: the existence of people who would prefer SPECKS over TORTURE, even if that preference is irrational, might change the outcome of the computation, because choosing TORTURE then amounts to violating their preferences. If TORTURE violates ~3^^^3 people’s preferences, then perhaps it is after all a harm comparable to SPECKS. This would certainly be true if everyone finds out which of SPECKS or TORTURE was chosen, in which case TORTURE makes it harder for a lot of people to sleep at night.
On the other hand, maybe you should force them to endure the guilt: it might motivate them to research why the agent who made the decision chose TORTURE, and so the end result will be some people learning some decision theory / critical thinking...
Also, if SPECKS vs TORTURE decisions come up a lot in this hypothetical universe, then realistically people will only feel guilty over the first one.
The argument that 50 years of torture of one person is preferable to 3^^^3 people suffering dust specks presumes utilitarianism. A non-utilitarian will not necessarily prefer torture to dust specks even if his/her critical thinking skills are up to par.
I’m not a utilitarian. The argument that 50 years of torture is preferable to 3^^^3 people suffering dust specks only presumes that preferences are transitive, and that there exists a sequence of gradations between torture and dust specks with the properties that (A) N people suffering one level of the spectrum is always preferable to N*(a googol) people suffering the next level, and (B) the spectrum has at most a googol levels. I think it’s pretty hard to consistently deny these assumptions, and I’m not aware of any serious argument put forth to deny them.
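To spell the chain out explicitly (a sketch only: $\succ$ means “is preferable to”, $G = 10^{100}$ is a googol, and the one premise added here beyond (A) and (B) is that more people suffering the same harm is worse):

% Level 0 is 50 years of torture; level G is a dust speck.
% Assumption (A), applied at most G times (at most G levels, by (B)):
\[
1 \text{ person at level } 0 \;\succ\; G \text{ people at level } 1
\;\succ\; G^2 \text{ people at level } 2 \;\succ\; \cdots
\;\succ\; G^G \text{ people at level } G .
\]
% Now G^G = (10^{100})^{10^{100}} = 10^{100 \cdot 10^{100}}, which is
% minuscule next to 3^^^3 (a power tower of roughly 7.6 trillion 3s).
% With the added premise that more sufferers of the same harm is worse:
\[
G^G \text{ dust specks} \;\succ\; 3\uparrow\uparrow\uparrow 3 \text{ dust specks},
\]
% and transitivity gives TORTURE \succ SPECKS.

Nothing in this sketch appeals to adding up utilities; it only chains pairwise comparisons, which is why the argument goes through for non-utilitarians too.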
It’s true that a deontologist might refrain from torturing someone even if he believes it would result in the better outcome. I was assuming a scenario where either way you are not torturing someone, just refraining from preventing them from being tortured by someone else.
Right. Utilitarianism is false, but Eliezer was still right about torture and dust specks.