Does that mean people for whom rationalism is a near-terminal goal that cannot become a lost purpose? Do you use “rationalism” somewhat like the way Charlie Sheen might use “winning”, as “rationalism” is often used here?
If yes and no, then to what end rationalism?
If yes and yes, then you value that for someone who, in relation to various things, wants them, that they have as a cherished thing achieving their own wants? Is people having such a nearly-terminal goal a correspondingly deep value of yours, or is it more instrumental? Either way, is coming to value that one of the smaller changes I could make to turn my values towards consistency (and how much more change than coming to consciously value it would I have to do if it is not emergent from my existing values)? If so, at what level would I be valuing that, presumably the same as you do, no? It isn’t enough to have a passing devotion to wanting that, that which I want, I should get it?
If this is unclear or badly off-target, let it indicate the magnitude of my confusion as to what you meant.
[09:28] Eliezer: if I had to take a real-world action, like, guessing someone’s name with a gun to my head
[09:29] Eliezer: if I had to choose it would suddenly become very relevant that I knew Michael was one of the most statistically common names, but couldn’t remember for which years it was the most common, and that I knew Michael was more likely to be a male name than a female name
[09:29] Eliezer: if an alien had a gun to its head, telling it “I don’t know” at this point would not be helpful
[09:29] Eliezer: because there’s a whole lot I know that it doesn’t
[09:30] X: ok
[09:33] X: what about a question for which you really don’t have any information?
[09:33] X: like something only an alien would know
[09:34] Eliezer: if I have no evidence I use an appropriate Ignorance Prior, which distributes probability evenly across all possibilities, and assigns only a very small amount to any individual possibility because there are so many
[09:35] Eliezer: if the person I’m talking to already knows to use an ignorance prior, I say “I don’t know” because we already have the same probability distribution and I have nothing to add to that
[09:35] Eliezer: the ignorance prior tells me my betting odds
[09:35] Eliezer: it governs my choices
[09:35] X: and what if you don’t know how to use an ignorance prior
[09:36] X: have never heard of it etc
[09:36] Eliezer: if I’m dealing with someone who doesn’t know about ignorance priors, and who is dealing with the problem by making up this huge elaborate hypothesis with lots of moving parts and many places to go wrong, then the truth is that I automatically know s/he’s wrong
[09:36] Eliezer: it may not be possible to explain this to them, short of training them from scratch in rationality
[09:36] Eliezer: but it is true
[09:36] Eliezer: and if the person trusts me for a rationalist, it may be both honest and helpful to tell them, “No, that’s wrong”
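The Ignorance Prior in the chat log is just a uniform distribution: with no evidence, each of N possibilities gets probability 1/N, while someone who knows base rates gets a sharper distribution and correspondingly different betting odds. A minimal sketch of that contrast (the candidate names and counts below are made up purely for illustration):

```python
def ignorance_prior(possibilities):
    """Uniform distribution: with no evidence, every possibility
    is equally likely, and each gets only a small share."""
    p = 1.0 / len(possibilities)
    return {x: p for x in possibilities}

def informed_prior(frequencies):
    """Normalize known base-rate counts into a distribution."""
    total = sum(frequencies.values())
    return {x: n / total for x, n in frequencies.items()}

# Hypothetical candidate names for the guessing game above.
names = ["Michael", "James", "Mary", "Zorblax"]
uniform = ignorance_prior(names)          # each name: 0.25

# Hypothetical base-rate counts a human (but not the alien) might know.
counts = {"Michael": 90, "James": 70, "Mary": 60, "Zorblax": 1}
informed = informed_prior(counts)         # "Michael" now dominates
```

With a gun to your head, you bet according to `informed`; telling the alien only “I don’t know” would discard exactly the information that separates the two distributions.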
Eliezer obviously wouldn’t be telling them to shut up and be evil; he’d be intending to tell that to the person he’d infer he was talking to. And if this “rare person” couldn’t learn Eliezer’s actual intent by inferring the message Eliezer had intended to communicate to the person Eliezer ought to have thought he was talking to, that rare person would be rarely dense.
That part of Eliezer’s message, then, is not flawed, and I’m not sure why you thought it needed addressing.
This assumes I’m reading this post correctly, something I’m not confident of.
Does that mean people for whom rationalism is a near-terminal goal that cannot become a lost purpose?
Maybe in some way, but not in the way that you interpret it to mean… I emphasize the importance of noticing lost purposes, which is central to both epistemic and instrumental rationality. Elsewhere in this thread I rewrote the post without the cool links, if you’re interested in figuring out what I originally meant. I apologize for the vagueness.
As for your second critique, I’m not claiming that Eliezer’s message is particularly flawed, just suggesting an improvement over current norms of which Eliezer’s original message could be taken as partially representative, even if it makes perfect sense in context. That is, Eliezer’s message isn’t really important to the point of the post and can be ignored.
Sunzi said: The art of war is of vital importance to the State. It is a matter of life and death, a road either to safety or to ruin. Hence it is a subject of inquiry which can on no account be neglected. The art of war, then, is governed by five constant factors, to be taken into account in one’s deliberations, when seeking to determine the conditions obtaining in the field. These are: (1) The Moral Law; (2) Heaven; (3) Earth; (4) The Commander; (5) Method and discipline. The Moral Law causes the people to be in complete accord with their ruler, so that they will follow him regardless of their lives, undismayed by any danger.
The very first factor in the very first chapter of The Art of War is about the importance of synchronous goals between agents and those they represent. It is instrumental in preserving the state. It is also instrumental in preserving the state (sic).
Sun Tzu replied: “Having once received His Majesty’s commission to be the general of his forces, there are certain commands of His Majesty which, acting in that capacity, I am unable to accept.” Accordingly, he had the two leaders beheaded, and straightway installed the pair next in order as leaders in their place.
A metaphor.
ridiculously strong aversion
The iron is hot, some feel fear.
just suggesting an improvement
You aren’t, though.
You’re expressing belief in a possible downside of current practice. We can say, flatly and unconditionally, that it is a downside if real, without its being right to minimize that downside. To your credit, you also argue that effects on the average influenced person are less valuable than is generally thought, which, if true, would be a step toward showing that a change in policy would be good.
But beyond that, you don’t articulate what a superior policy would be, and you have a lot of intermediate conclusions to establish to make a robust criticism.
Does that mean people for whom rationalism is a near-terminal goal that cannot become a lost purpose? Do you use “rationalism” somewhat like the way Charlie Sheen might use “winning”, as “rationalism” is often used here?
If yes and no, then to what end rationalism?
If yes and yes, then you value that for someone who, in relation to various things, wants them, that they have as a cherished thing achieving their own wants? Is people having such a nearly-terminal goal a correspondingly deep value of yours, or is it more instrumental? Either way, is coming to value that one of the smaller changes I could make to turn my values towards consistency (and how much more change than coming to consciously value it would I have to do if it is not emergent from my existing values)? If so, at what level would I be valuing that, presumably the same as you do, no? It isn’t enough to have a passing devotion to wanting that, that which I want, I should get it?
If this is unclear or badly off-target, let it indicate the magnitude of my confusion as to what you meant.
This comes to mind.
The very first factor in the very first chapter of The Art of War is about the importance of synchronous goals between agents and those they represent. It is instrumental in preserving the state. It is also instrumental in preserving the state (sic).
Even so,
You aren’t, though.
You’re expressing belief in a possible downside of current practice. We can say, flatly and unconditionally, that it is a downside if real, without its being right to minimize that downside. To your credit, you also argue that effects on the average influenced person are less valuable than is generally thought, which, if true, would be a step toward showing that a change in policy would be good.
But beyond that, you don’t articulate what a superior policy would be, and you have a lot of intermediate conclusions to establish to make a robust criticism.
Correct, I was imprecise. I’m listing a downside and listing nonobvious considerations that make it more of a downside than might be assumed.