I’m surprised. Do you mean you wouldn’t trade off a dust speck in your eye (in some post-singularity future where x-risk is settled one way or another) to avert the torture of a billion frogs, or of some noticeable portion of all frogs? If we plotted your attitudes to progressively more intelligent entities, where’s the discontinuity or discontinuities?
You’d need to change that to 10^6 specks and 10^15 frogs or something, because your emotional reaction to choosing to kill the frogs is also part of the consequences of the decision, and that particular consequence might have moral value that outweighs one speck.
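To make the scaling argument concrete, here is a toy comparison; every number in it is invented purely for illustration and is not an estimate anyone in the thread has endorsed:

```python
# Toy utility comparison; all magnitudes are made up for illustration only.
speck = 1.0                 # disutility of one dust speck (arbitrary units)
decision_distress = 100.0   # disutility of knowingly condemning the frogs yourself
p_sentient = 0.1            # assumed probability that frogs can suffer at all
per_frog = 1e-3             # expected disutility per tortured frog, if sentient

def frog_term(n_frogs: float) -> float:
    """Expected disutility attributable to the frogs themselves."""
    return p_sentient * per_frog * n_frogs

# With a single speck, the chooser's own distress already dominates the
# comparison, so the answer reveals little about the frogs' moral worth.
print(speck < decision_distress)                          # True

# Scaling both sides (e.g. 10**6 specks vs. 10**15 frogs) makes the frog
# term, not the distress term, decide the question.
print(1e6 * speck, decision_distress + frog_term(1e15))   # 1000000.0 vs. ~1e11
```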
Your emotional reaction to a decision about human lives is irrelevant, since the lives in question hold most of the moral worth; with a decision to kill billions of cockroaches (to stay clear of the question of frogs’ moral worth), it’s the other way around: the cockroaches’ lives are irrelevant, and your emotional reaction holds most of the moral worth.
I’m not so sure. I’m no expert on the subject, but I suspect cockroaches may have moderately rich emotional lives.
Hopefully he still thinks there’s a small probability of frogs being able to experience pain, so that the expected suffering from frog torture would still be hugely greater than that from a dust speck.
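As a sketch of that expected-value point (the probability and disutility figures below are placeholders, chosen only to show the orders of magnitude involved):

```python
# Placeholder figures; the point is the orders of magnitude, not the values.
p_frogs_feel_pain = 0.01      # even a small credence that frogs can suffer
n_frogs = 1e9                 # a billion frogs tortured
suffering_per_frog = 1.0      # disutility per sentient frog tortured
speck = 1e-6                  # disutility of one dust speck, on the same scale

expected_torture = p_frogs_feel_pain * n_frogs * suffering_per_frog  # 1e7
print(expected_torture / speck)   # ~1e13: the expected harm dwarfs the speck
```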
Depends. Would that make it harder to get frog legs?
Same questions to you, but with “rocks” for “frogs”.
Eliezer didn’t say he was 100% sure frogs weren’t objects of moral worth, nor is it a priori unreasonable to believe there exists a sharp cutoff without knowing where it is.