Not sure what you are asking. My point was that the human notion of torture is a priori a tiny speck in the ocean of possible Turing machines. We don’t know nearly enough at this point to worry about accidental or intentional sim torture, so we shouldn’t, until at least a ballpark estimate with a few-sigma confidence interval can be computed. This is a standard cognitive bias people fall into here and elsewhere: a failure of imagination. In this particular case someone brings up the horrifying possibility of 3^^^3 sims being tortured and killed, and the emotional response to it is strong enough to block any rational analysis of the odds of it happening. Also reminds me of people conjuring aliens as human-looking, human-thinking and, of course, emotionally and sexually compatible. EY wrote about how silly this is at some length.
Is it not clear that in order to calculate the probability of any proposition, you need an actual definition of the proposition at hand?
My point was that the human notion of torture is a priori a tiny speck in the ocean of possible Turing machines. We don’t know nearly enough at this point to worry about accidental or intentional sim torture, so we shouldn’t, until at least a ballpark estimate with a few-sigma confidence interval can be computed.
I think we agree that the only currently feasible arguments for any given value of P(a mind in such and such a mindspace is being tortured) are those based on heuristics.
However, you say these minds constitute “a priori a tiny speck”, and I do not endorse such a statement (given any reasonable definition of torture), unless you have some unstated, reasonable, heuristic reason for believing so. Ironically, “failure of imagination” is frequently a counterargument to people arguing that a certain reference class is a priori very small.
Is it not clear that in order to calculate the probability of any proposition, you need an actual definition of the proposition at hand?
My only reason is of the Pascal’s-wager type: you pick one possibility (tortured sims) out of unimaginably many, without providing any estimate of its abundance in the sea of all possibilities; why privilege it?
I don’t think most people talking about torture vs dust specks actually expect it to happen. And even if it actually could happen, it might be a smart idea to precommit to refuse to play any crazy games with an intelligence that wants to torture people. The point of the discussion is ethics. It’s a thought experiment. It’s not actually going to happen.
Not sure why you are bringing up specks vs torture, must be some misunderstanding.
It was the line about torturing and killing 3^^^3 sims. It seemed like you were referencing all of the various thought experiments people have discussed here involving that number. I only mentioned torture vs specks, but the point is the same. I don’t think anyone ever actually expects something to happen in real life that involves the number 3^^^3.
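For anyone unfamiliar with the notation being invoked here: 3^^^3 is Knuth up-arrow notation, and a short sketch of the recursion makes clear why nobody expects a real-life scenario involving that number. The function below is my own illustrative implementation (the name `up` and its signature are not from the thread), not something any commenter wrote.

```python
def up(a, n, b):
    """Knuth up-arrow: a followed by n up-arrows, then b.

    n == 1 is ordinary exponentiation; each extra arrow iterates
    the previous operation, so the values explode immediately.
    """
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up(a, n - 1, up(a, n, b - 1))

# 3^^3 = 3^(3^3) = 3^27, already about 7.6 trillion:
print(up(3, 2, 3))  # → 7625597484987

# 3^^^3 = 3^^(3^^3) is a power tower of 7,625,597,484,987 threes.
# Calling up(3, 3, 3) would never terminate in practice; the
# definition above is only meant to show how fast the tower grows.
```

Even the two-arrow case overflows any physical register, which is the usual point of picking 3^^^3 for these thought experiments: the number is chosen precisely because it dwarfs any physically realizable quantity.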