So if I lock you up in my house, and you try to run away, so I give you a lobotomy so that now you don’t run away, we’ve thereby become friends?
Not with a lobotomy, no. But with a more sophisticated brain surgery/wipe that caused me to value spending time in your house and making you happy and so forth, then yes, after the operation I would probably consider you a friend, or something quite like it.
Obviously, as a Toggle who has not yet undergone such an operation, I consider it a hostile and unfriendly act. But that has no bearing on what our relationship is after the point at which you get to arbitrarily decide what our relationship is.
There’s a difference between creating someone with certain values and altering someone’s values. For one thing, it’s possible to prohibit messing with someone’s values, but you can’t create someone without creating them with values. It’s not like you can create an ideal philosophy student of perfect emptiness.
Only if you prohibit interacting with him in any way.
I don’t mean you can feasibly program an AI to do that. I just mean that it’s something you can tell a human to do and they’d know what you mean. I’m talking about deontological ethics, not programming a safe AI.
How about if I get some DNA from Kate Upton, tweak it for high sex drive, low intelligence, low initiative, pliability, and a desperation to please, and then I grow a woman from it? Is she my friend?
If you design someone to serve your needs without asking that you serve theirs, the word “friend” is misleading. Friendship is mutually beneficial. I believe friendship signifies a relationship between two people that can be defined in operational terms, not a quale that one person has. You can’t make someone actually be your friend just by hypnotizing them to believe they’re your friend.
Belief and feeling are probably part of the definition. It’s hard to imagine saying two people are friends without their knowing it. But I think the pattern of mutually beneficial behavior is also part of it.
That too, but I would probably stress the free choice part. In particular, I don’t think friendship is possible across a large power gap.