The rudeness wouldn’t have helped with the downvotes, I can understand that.
But the factor I was pointing out, the common factor for grouping them together, was their inability to say “oops”. I’m sorry, I didn’t make that very clear; that is why I listed the assholes alongside the nice people.
MrHen left LessWrong believing in a God, and Mitchell_Porter (as far as I can tell) still believes dualism needs to be true if colour exists (or whatever his argument was; I’m embarrassing myself by trying to simplify an argument I understood poorly). They were/are otherwise great rationalists, and they both make sure to be very humble in general while on the site.
The other three were often rude, but the main reason I pointed them out was their inability to say “oops” when their rational failings were pointed out to them. Unlike the other two, they proceeded to act very douchey until driven from the site, though their first posts are much less abrasive and rude.
In general though, if they aren’t going to work out that they’re wrong at LessWrong, where are they going to work it out?
Some of these people may work it out with time, and it may be unreasonable to expect them to change their mind straight away.
But this should at least show how difficult it is for an irrational person to become more rational; it’s like needing to already know the rules in order to learn the rules.
What does it take to commit to wanting rationality from a beginning of irrationality?
These examples show that there are people on LessWrong who aren’t rational, and while that isn’t a surprise, I feel like the LessWrong community should perhaps learn from the failings of some of these people, in order to react better to situations like this in the future. Or something, I don’t know.
In any case, thank you for replying.
Compartmentalization.
Bold statement that somehow still seems true: most LessWrongers probably have a belief of comparable wrongness. MrHen is just unlucky.
The argument is that for dualism not to be true, we need a new ontology of fundamental quantum monads that no one else quite gets. :-) My Chalmers-like conclusion, that the standard computational theory of mind implies dualism, is an argument against the standard theory.
Deciding that being less wrong than I am now is valuable, realizing that doing what I’ve been doing all along is unlikely to get me there, and being willing to give up familiar habits in exchange for alternatives that seem more likely to get me there. These are independently fairly rare and the intersection of them is still more so.
This doesn’t get me to wanting “rationality” per se (let alone to endorsing any specific collection of techniques, assumptions, etc., still less to the specific collection that is most popular on this site), it just gets me looking for some set of tools that is more reliable than the tools I have.
I’ve always understood the initial purpose of LW to be to present a specific collection of tools such that someone who has already decided to look can more easily settle on that specific collection (which, of course, is endorsed by the site founder as particularly useful), at-least-ostensibly in the hope that some of them will subsequently build on it and improve it.
Getting someone who isn’t looking to start looking is a whole different problem, and more difficult on multiple levels (practical, ethical, etc.).
You need some initial luck. It’s as if the human mind were a self-modifying system, where the rules can change the rules, and again, and again. Thus the human mind floats around in a mindset space. The original setting is rather fluid, for evolutionary reasons: you should be able to join a different tribe if that becomes essential for your survival. On the other hand, the mindset space contains some attractors; if you happen to land on a certain set of rules, those rules keep preserving themselves. Rationality could be one of those attractors.
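A toy sketch of that attractor picture, purely as illustration (the one-dimensional “mindset” variable and the update rule below are invented for the example, not a claim about real minds): the current rules decide how the rules change, so the starting point plus a little noise determines which attractor eventually captures the system.

```python
import random

def step(mindset, noise=0.02):
    """One self-modification step: the current rules decide how the rules change.
    Mindsets above ~0.5 reinforce themselves upward, those below erode downward."""
    drift = 0.1 * (mindset - 0.5)  # self-reinforcing pull toward the attractors at 0 and 1
    return min(1.0, max(0.0, mindset + drift + random.uniform(-noise, noise)))

def run(start, steps=200):
    m = start
    for _ in range(steps):
        m = step(m)
    return m

random.seed(0)
for start in (0.30, 0.48, 0.52, 0.70):
    print(f"start={start:.2f} -> after 200 steps: {run(start):.2f}")
# Clear starting points converge to the nearer attractor (0.0 or 1.0);
# starts near the boundary can end up in either basin depending on the noise,
# i.e. on "initial luck".
```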
Is the inability to update one’s mind really so exceptional on LW? One way of not updating is “blah, blah, blah, I don’t listen to you”. This happens a lot everywhere on the internet, but LW is probably not attractive to those people. The more interesting case is “I listen to you, and I value our discussion, but I don’t update”. This seems paradoxical, but I think it’s actually not unusual; the only unusual thing is the naked form: people who refuse to update, and recognize that they refuse to update. The usual form is that people pretend to update… except that their updates don’t fully propagate. In other words, there is no update, only belief in update. Things like: yeah, I agree about the Singularity and stuff, but somehow I haven’t signed up for cryopreservation; and I agree human lives are valuable and there are charities which can save a hundred human lives for every dollar sent to them, but somehow I haven’t sent a single dollar yet; and I agree that rationality is very important and being strategic can increase one’s utility, and then I procrastinate on LW and other websites and my everyday life goes on without any changes.
We are so irrational that even our attempts to become rational are horribly irrational, and that’s why they often fail.