Those properties that we think make happy humans better than totally artificial smiling humans mimicking happy humans.
This I guessed.
You’d need to find it in order to grasp what it means to have a being that lacked moral value.
Why? “No moral value” has a clear decision-theoretic meaning, and referring to particular patterns that do have moral value doesn’t improve on that understanding. Moreover, examples of things that have moral value are easy to imagine.
“Both ideas” refers to the two distinct ways of explaining what sort of paperclip maximizer we’re talking about.
This I still don’t understand. You’d need to name the two ideas; my intuition for grasping the intended meaning often fails me. The one relevant idea I see is that the paperclip maximizer lacks moral value. What’s the other, and how is it relevant?