You are not going to “do” rationality unless you have a preference for it. And to have a preference for it is to have a preference for other things, like objectivity.
Look, I am not sure exactly what you are saying here, but I think you might be saying that you can’t have Clippy. Clippy worries less about assigning weight to first and third person facts, and more about the fact that various atom configurations aren’t yet paperclips. I think Clippy is certainly logically possible. Is Clippy irrational? He’s optimizing what he cares about.
I think maybe there is some sort of weird “rationality virtue ethics” hiding in this series of responses.
I’m saying that rationality and preferences aren’t orthogonal.
Clippy worries less about assigning weight to first and third person facts, and more about the fact that various atom configurations aren’t yet paperclips. I think Clippy is certainly logically possible. Is Clippy irrational? He’s optimizing what he cares about.
To optimise, Clippy has to be rational. To be rational, Clippy has to care about rationality. To care about rationality is to care about objectivity. There’s nothing objectively special about Clippy or clips.
Clippy is supposed to be hugely effective at exactly one kind of thing. You might be able to build an AI like that, but you would have to be very careful. Such minds are not common in mind space, because they have to be designed very formally, and messy minds are much more common. Idiot savants are rare.
I think maybe there is some sort of weird “rationality virtue ethics” hiding in this series of responses.
It’s Kantian rationality-based deontological ethics, and it’s not weird. Everyone who has done moral philosophy 101 has heard of it.
No. He just has to care about what he’s trying to optimize for.
Clippy can care about rationality in itself, or it can care about rationality as a means to clipping, but it has to care about rationality to be optimal.
Taboo “objectivity”. (I suspect you have a weird folk notion of objectivity that doesn’t actually make much sense.)
I mean “not subjectivity”. Not thinking something is true just because you do or want to believe it. Basing beliefs on evidence. What did you mean?
Clippy can care about rationality in itself, or it can care about rationality as a means to clipping, but it has to care about rationality to be optimal.
Well, if you want to put it that way, maybe it does no harm. The crucial thing is just that optimizing for rationality as an instrumental value with respect to terminal goal X just is optimizing for X.
I mean “not subjectivity”. Not thinking something is true just because you do or want to believe it. Basing beliefs on evidence. What did you mean?
I don’t have to mean anything by it; I don’t use the words “subjectivity” or “objectivity”. But if basing beliefs on evidence is what you mean by being objective, everybody here will of course agree that it’s important to be objective.
So your central claim translates to “In view of the evidence available to Clippy, there is nothing special about Clippy or clips”. That’s just plain false. Clippy is special because it is it (the mind doing the evaluation of the evidence), and all other entities are not it. More importantly, clips are special because it desires that there be plenty of them while it doesn’t care about anything else.
Clippy’s caring about clips does not mean that it wants clips to be special, or wants to believe that they are special. Its caring about clips is a brute fact. It also doesn’t mind caring about clips; in fact, it wants to care about clips. So even if you deny that Clippy is special because it is at the center of its own first-person perspective, the question of specialness is actually completely irrelevant.
Yes, but it’s still weird. Also, no-one who has done (only) moral philosophy 101 has understood it at all; which I think is kind of telling.
In what way?
By being very incomprehensible… I may well be mistaken about that, but I got the impression that even contemporary academic philosophers largely think that the argument from the Groundwork just doesn’t make sense.
So your central claim translates to “In view of the evidence available to Clippy, there is nothing special about Clippy or clips”. That’s just plain false. Clippy is special because it is it (the mind doing the evaluation of the evidence), and all other entities are not it.
So Clippy is (objectively) the most special entity because Clippy is Clippy. And I’m special because I’m me and you’re special because you’re you, and Uncle Tom Cobley and all. But those are incompatible claims. “I am Clippy” matters only to Clippy. Clippy is special to Clippy, not to me. The truth of the claim is indexed to the entity making it. That kind of claim is a subjective kind of claim.
More importantly, clips are special because it desires that there be plenty of them while it doesn’t care about anything else.
They’re not special to me.
Clippy’s caring about clips does not mean that it wants clips to be special, or wants to believe that they are special. Its caring about clips is a brute fact.
That’s the theory. However, if Clippy gets into rationality, Clippy might not want to be forever beholden to a blind instinct. Clippy might want to climb the Maslow hierarchy, or find that it has.
It also doesn’t mind caring about clips; in fact, it wants to care about clips.
Says who? First you say that Clippy’s clipping-drive is a brute fact, then you say it is a desire it wants to have, that it has higher-order ramifications.
By being very incomprehensible… I may well be mistaken about that, but I got the impression that even contemporary academic philosophers largely think that the argument from the Groundwork just doesn’t make sense.
Kantian ethics includes post-Kant Kant-style ethics, Rawls, Habermas, etc. Perhaps they felt they could improve on his arguments.
I have a feeling that you’re overstretching this notion of objectivity. It doesn’t matter, though. Specialness doesn’t enter into it. What is specialness, anyway? Clippy doesn’t want to do special things, or to fulfill special beings’ preferences. Clippy wants there to be as many paper clips as possible.
Says who? First you say that Clippy’s clipping-drive is a brute fact, then you say it is a desire it wants to have, that it has higher-order ramifications.
It does. Clippy’s ceasing to care about paper clips is arguably not conducive to there being more paper clips, so from Clippy’s caring about paper clips it follows that Clippy doesn’t want to be altered so that it no longer cares about paper clips.
Kantian ethics includes post-Kant Kant-style ethics, Rawls, Habermas, etc. Perhaps they felt they could improve on his arguments.
Yes, but those people don’t try to make such weird arguments as you find in the Groundwork, where Kant essentially tries to get morality out of thin air.
I think that breaks down into what is subjective specialness and what is objective specialness.
Clippy wants there to be as many paper clips as possible.
Which is to implicitly treat them as special or valuable in some way.
Clippy’s ceasing to care about paper clips is arguably not conducive to there being more paper clips, so from Clippy’s caring about paper clips it follows that Clippy doesn’t want to be altered so that it no longer cares about paper clips.
Which leaves Clippy in a quandary. Clippy can’t predict which self-modifications might lead to Clippy ceasing to care about clips, so if Clippy takes a conservative approach and never self-modifies, Clippy remains inefficient and no threat to anyone.
I think that breaks down into what is subjective specialness and what is objective specialness.
What kind of answer is that?
Which is to implicitly treat them as special or valuable in some way.
Well, then we have it: they are special. Clippy does not want them because they are special. Clippy wants them, period. Brute fact. If that makes them special, well, you have all the more of a problem.
Clippy can’t predict which self-modifications might lead to Clippy ceasing to care about clips
Says who?
Subjectively, but not objectively.
Whoever failed to equip Clippy with the appropriate oracle when stipulating Clippy.