> Nobody asked me to take either vow. Doing so isn’t in the spirit of this community.

I believe that’s why So8res referred to it as a vow to yourself, not anyone else. Also note that this is a series of posts meant to introduce people to Rationality: AI to Zombies, not “this community” (by which I assume you mean LW).

> There’s nothing wrong with willing reality to be different. It leads to actions that change reality.

This seems like a willful misreading of the essay’s point. It seems obvious from context that So8res is referring here to motivated cognition, which does indeed have something wrong with it.
> I believe that’s why So8res referred to it as a vow to yourself, not anyone else.

I also haven’t heard anybody speak before about taking those kinds of vows to oneself.

> This seems like a willful misreading of the essay’s point. It seems obvious from context that So8res is referring here to motivated cognition, which does indeed have something wrong with it.

I consider basics to be important. If we allow vague statements about basic principles of rationality to stand, we don’t improve our understanding of rationality.

Willing is not the problem with motivated cognition. Having desires for reality to be different is not the problem. You don’t need to be a Straw Vulcan, free of any desire or will, in order to be rational.

Furthermore, “Shut up and do the impossible” from the Sequences is about “trying to will reality into being a certain way”.
I think “The Twelve Virtues of Rationality” actually makes an argument that those things are virtues.

Its start is also quite fitting: “The first virtue is curiosity. A burning itch to know is higher than a solemn vow to pursue truth.”

It argues against the frame of vows.

Withdrawing into mysticism where anything goes is bad. Obfuscating is bad. It’s quite easy to say something that produces rationalist applause lights. Critical thinking and actually thinking through the implications of using the frame of a vow is harder. Getting less wrong about what actually is rational is hard.

Mystic writing that’s too vague to be questioned doesn’t really have a place here.
> The root is that nobody asked me in a metaphorical way to take a vow either.

Er, yes, someone has. In fact, Eliezer has asked you to do so. From the Twelve Virtues:

> The third virtue is lightness. Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can. Do this the instant you realize what you are resisting; the instant you can see from which quarter the winds of evidence are blowing against you. Be faithless to your cause and betray it to a stronger enemy. If you regard evidence as a constraint and seek to free yourself, you sell yourself into the chains of your whims. For you cannot make a true map of a city by sitting in your bedroom with your eyes shut and drawing lines upon paper according to impulse. You must walk through the city and draw lines on paper that correspond to what you see. If, seeing the city unclearly, you think that you can shift a line just a little to the right, just a little to the left, according to your caprice, this is just the same mistake.

This is the exact same thing that the article is saying:

> In order to study the art of human rationality, one must make a solemn pact with themselves. They must vow to stop trying to will reality into being a certain way; they must vow to instead listen to reality tell them how it is.

> Furthermore, “Shut up and do the impossible” from the Sequences is about “trying to will reality into being a certain way”.

No, it’s about actually finding the way to force reality into some state others considered so implausible that they hastily labeled it impossible. Saying, “If the probability isn’t 0%, then to me it’s as good as 100%!” isn’t saying you can defy probability, but merely that you have a lot of information and compute-power. Or it might even just be expressing a lot of emotional confidence for someone else’s sake.

(Or that you can solve your problems with giant robots, which is always the awesomer option.)
This is what is known as “neglecting context”. Right after the sentence you originally quoted from the article, we see this:

> They must recognize “faith” as an attempt to disconnect their beliefs from the voice of the evidence; they must vow to protect the ephemeral correspondence between the real world and their map of it.

I’m not quite sure why you’re having difficulty understanding this. “Willing reality into being a certain way”, in this context, does not mean desiring to change the world, but rather shifting one’s probability estimates toward one’s desired conclusion. For example, I have a strong preference that UFAI not be created. However, it would be a mistake for me to then assign a 0.00001% probability to the creation of UFAI purely because I don’t want it to be created; the true probability is going to be higher than that. I might work harder to stop the creation of UFAI, which is what you mean by “willing reality”, but that is clearly not the meaning the article is using.
> I believe that’s why So8res referred to it as a vow to yourself, not anyone else. Also note that this is a series of posts meant to introduce people to Rationality: AI to Zombies, not “this community” (by which I assume you mean LW).

> This seems like a willful misreading of the essay’s point. It seems obvious from context that So8res is referring here to motivated cognition, which does indeed have something wrong with it.

> I also haven’t heard anybody speak before about taking those kinds of vows to oneself.

> I consider basics to be important. If we allow vague statements about basic principles of rationality to stand, we don’t improve our understanding of rationality.

> Willing is not the problem with motivated cognition. Having desires for reality to be different is not the problem. You don’t need to be a Straw Vulcan, free of any desire or will, in order to be rational.

> Furthermore, “Shut up and do the impossible” from the Sequences is about “trying to will reality into being a certain way”.

It’s not literal. It’s an attempt at poetic language, like The Twelve Virtues of Rationality.
> I think “The Twelve Virtues of Rationality” actually makes an argument that those things are virtues.

> Its start is also quite fitting: “The first virtue is curiosity. A burning itch to know is higher than a solemn vow to pursue truth.”

> It argues against the frame of vows.

> Withdrawing into mysticism where anything goes is bad. Obfuscating is bad. It’s quite easy to say something that produces rationalist applause lights. Critical thinking and actually thinking through the implications of using the frame of a vow is harder. Getting less wrong about what actually is rational is hard.

> Mystic writing that’s too vague to be questioned doesn’t really have a place here.

Sure, I agree with all of that. I was just trying to get at the root of why “nobody asked [you] to take either vow”.
The fact that I haven’t taken a literal vow is true, but the meaning of what I was saying goes beyond that point.

The root is that nobody asked me in a metaphorical way to take a vow either. Eliezer asked for curiosity instead of a solemn vow when talking about the rationalist virtues.

There are reasons why that’s the case.
> Er, yes, someone has. In fact, Eliezer has asked you to do so. From the Twelve Virtues:

> This is the exact same thing that the article is saying:

> No, it’s about actually finding the way to force reality into some state others considered so implausible that they hastily labeled it impossible. Saying, “If the probability isn’t 0%, then to me it’s as good as 100%!” isn’t saying you can defy probability, but merely that you have a lot of information and compute-power. Or it might even just be expressing a lot of emotional confidence for someone else’s sake.

> (Or that you can solve your problems with giant robots, which is always the awesomer option.)

The sentence “trying to will reality into being a certain way” doesn’t say anything about p=0 or defying probability.