Why?
Consider that if stuff someone says resonates with you, that someone is optimizing for that.
There are two quite different scenarios here.
In scenario 1, that someone knows me beforehand and optimizes what he says to influence me.
In scenario 2, that someone doesn’t know who will respond, but is optimizing his message to attract specific kinds of people.
The former scenario is a bit worrisome—it’s manipulation. But the latter one looks fairly benign to me—how else would you attract people with a particular set of features? Of course the message is, in some sense, bait but unless it’s poisoned that shouldn’t be a big problem.
I don’t know why scenario 2 should be any less worrisome. The distinction between “optimized for some perception/subset of you” and “optimized for someone like you” is completely meaningless.
Because of degree of focus. It’s like the distinction between a black-hat scanning the entire ’net for vulnerabilities and a black-hat scanning specifically your system for vulnerabilities. Are the two equally worrisome?
Equally worrisome, conditional on my having the vulnerability the black-hat is trying to use. This is equivalent to the original warning being conditional on something resonating with you.
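To spell out the conditioning point, here is a toy formalization (my own sketch, not anything stated in the thread): let V be the event that you have the vulnerability, S the event that the attacker reaches you, and H the event of harm, and assume harm can occur only when both V and S hold, with the exploit then succeeding with probability p.

\[
P(H \mid V \cap S) = p \quad \text{(identical in both scenarios)}, \qquad
P(H) = p \cdot P(V \cap S).
\]

The two scenarios can differ only through P(V ∩ S). “Something resonating with you” is the analogue of that joint event already having occurred, so once you condition on it, the targeted and the broad case carry the same risk.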
MIRI survives in part via donations from people who bought the party line on stuff like MWI.
Are you saying that based on having actually looked at data? I think we should have a census that includes numbers on MIRI donations and belief in MWI.
Really, you would want the delta in MWI belief (relative to before they found LW) to measure “bought the party line.”
I am not trying to emphasize MWI specifically, it’s the whole set of tribal markers together.
If there is a tribal marker, it’s not MWI per se; it’s choosing an interpretation of QM on grounds of explanatory parsimony. Eliezer clearly believed that MWI is the only interpretation of QM that qualifies on such grounds. However, such a belief is quite simply misguided; it ignores several other formulations, e.g. relational quantum mechanics, the ensemble interpretation, and the transactional interpretation, that are also remarkable for their overall parsimony. Someone who advocated for one of these other approaches would be just as recognizable as a member of the rationalist ‘tribe’.
choosing an interpretation of QM on grounds of explanatory parsimony.
What was contested was the strength of the MW claim. Explanatory parsimony doesn’t differentiate a strong claim from a weak one.
OP’s original claim:
Why does E. Yudkowsky voice such strong priors e.g. wrt. the laws of physics (many worlds interpretation), when much weaker priors seem sufficient for most of his beliefs (e.g. weak computationalism/computational monism) and wouldn’t make him so vulnerable? (With vulnerable I mean that his work often gets ripped apart as cultish pseudoscience.)
A fair point. Maybe I’m committing the typical mind fallacy and underestimating the general gullibility of people. If someone offers you something, it’s obvious to me that you should look for strings attached, consider the incentives of the giver, and ponder the consequences (including those concerning your mind). If you don’t understand why something is being given to you, it’s probably wise to delay grabbing the cheese (or to not touch it at all) until you do.
And still, this all looks to me like a plain-vanilla example of bootstrapping an organization and creating a base of support, financial and otherwise, for it. Unless you think there were lies, misdirections, or particularly egregious sins of omission, that’s just how the world operates.
Also, anyone who succeeds in attracting people to an enterprise, be it by the most impeccable of means, will find the people they have assembled creating tribal markers anyway. The leader doesn’t have to give out funny hats. People will invent their own.
People do a lot of things. Have biases, for example.
There is quite a bit of our evolutionary legacy it would be wise to deemphasize. It’s not as if there aren’t successful examples of people doing good work together without becoming a tribe.
edit: I think what’s going on is that a lot of the rationalist tribe folks are on the spectrum and/or “nerdy”, and thus have a more difficult time forming communities, and LW etc. was a great way for them to get something important into their lives. They find it valuable, and rightly so. They don’t want to give it up.
I am sympathetic to this, but I think it would be wise to separate the community aspects from rationality itself as a “serious business.” For example, I am friends with lots of academics, but the academic part of our relationship has to be kept separate (I would rip into their papers in peer review, etc.). The guru/disciple dynamic, I think, is super unhealthy.
Because warning against dark-side rationality by using dark-side rationality to find light-side rationalists doesn’t look good in light of the perennial c-word claims against LW...