It’s my judgement that the paperclipper’s life is not worth living. By my standards, sure; objective morality makes no sense, so what other standards could I use?
The paperclipper’s own opinion matters to me, but not all that much.
Would you engage with a particular paperclipper in a discussion (plus observation etc.) to refine your views on whether its life is worth living? (We are straying away from a nominal AIXI-type definition of “the” paperclipper, but I think your initial comment warrants that. Besides, even an AIXI agent depends on both terminal values and history.)
No, if I did so it’d hack my mind and convince me to make paperclips in my own universe. Assuming it couldn’t somehow use the communications channel to directly take over our universe.
I’m not quite sure what you’re asking here.
Oh well, I hadn’t thought of that. I was “asking” about the methodology for judging whether a life is worth living.
Whether or not I would enjoy living it, taking into account any mental changes I would be okay with.
For a paperclipper... yeah, no.
But you have banned most of the means of approximating the experience of living such a life, no? In the general case you wouldn’t be justified in your claim (where by the general case I mean a situation in which I strongly doubt you know the other entity, not the case of “the” paperclipper). Do you have a proof that having a single terminal value excludes having a rich structure of instrumental values? Or does the way you experience terminal values overwhelm the way you experience instrumental values?
Assuming that clippy (or the cow, which makes more sense) feels “enjoyment”, aren’t you just failing to model them properly?
It’s feeling enjoyment from things I dislike, and failing to pursue goals I do share. It has little value in my eyes.
Which is why I, who like chocolate ice cream, categorically refuse to buy vanilla or strawberry for my friends.
Nice strawman you’ve got there. Pity if something were to... happen to it.
The precise tastes are mostly irrelevant, as you well know. Consider instead a scenario where your friend asks you to buy a dose of cocaine.
I stand by my reductio. What is the difference between clippy enjoying paperclips vs. humans enjoying ice cream, and me enjoying chocolate ice cream vs. you enjoying strawberry? Assuming none of them are doing things that give each other negative utility, such as clippy turning you into paperclips or me paying the ice cream vendor to only purchase chocolate (more for me!).