Basically, you are being conditioned to feel good about a bad approach to reasoning (one where you make huge jumps, don’t note the assumptions you make, make invalid assumptions, don’t search for faults, etc.), and to feel bad about a good approach to reasoning.
My initial response to this was “that seems completely untrue,” so I decided to hunt for examples. I think you’re right, because I was able to come up with an example of myself doing this, namely downloading music and movies for free from the Internet. I do consider this kind-of-vaguely-like-stealing, but the “kind-of-vaguely” part is a good indication that my thinking is deliberately fuzzy in this area.
When I think about it, I don’t know why; I don’t consume enough entertainment that paying for it would be a significant drain on my finances, and I’m hardly financially strapped. I think it’s because the usual strong positive reinforcement I would get for knowing I was “doing the right thing” despite wanting Thing X really badly is outweighed by the knowledge that several of my friends would make fun of me for paying for stuff on iTunes. Which...if I think about it...is also a pretty selfish reason!
You may just have convinced me that I should start paying for my music and movies, as a way of training my moral thinking to be less “sloppy”!
Heh. But why did I do that? Selfish motives also (I make software for a living).
I came up with another example. Consider the sunk cost issue. Suppose that you spent years working on a project that is heading nowhere, the effort was wasted, and there’s a logical way to see that it was wasted. Any time your thought wavers in the direction of understanding that the effort was wasted, you get a stab of negative emotions: particular hormones are released into the bloodstream, particular pathways activate, and that is negative reinforcement for everything you’ve been doing, including the use of the mental framework that led you to that thought. I think LW calls something similar an ‘ugh field’, except the issue is that reinforcement is not so specific in its action as to make you avoid one specific thought without also making you avoid the very method of thinking that got you there.
I think it may help in general (to combat the induced sloppiness) to do some kind of work where you are reliably negatively reinforced for being wrong or sloppy. Studying mathematics and doing the exercises correctly can be useful. (Studying without exercises doesn’t even work.) Software development, also. This will build the skill of avoiding sloppiness, but it won’t necessarily transfer to moral reasoning; for the skill to transfer, something else may be needed.
Solution: have a community where you can gain respect and status by having successfully noticed and avoided sunk cost reasoning. LW isn’t the best possible example of such a community, but a lot of the exercises done at, say, the summer minicamps in San Francisco were subsets of “get positive reinforcement for noticing Irrational Thought Pattern X in yourself, when normally various kinds of cognitive dissonance would make it tempting to sort of vaguely not notice it.”
This has its own failure mode.
I had read that article before. It’s not something that I would consider a problem for myself...I rarely if ever abandon a project in the middle, and when I do, it’s a) always been a personal project or goal that affects no one else, and b) always been something that turned out to be either a bad idea in the first place (e.g. my goal at age 14 of weighing 110 pounds, which would never happen unless I actually developed an eating disorder), or important to me for the wrong reasons (going to the Olympics for swimming). Etc.
Note that this isn’t any kind of argument against your point… If anything, it’s my own personal failure mode of assuming everyone’s brain is like mine and that their main problems are like mine.
However, I think it does count for something that nyan_sandwich posted this article, noticing a flaw in his reasoning, on LW...and got upvotes and praise.
LW is a terrible example: an attachment to a bunch of people (SI) who keep sinking their own effort and other people’s money, and rationalizing it. Regarding noticing an irrational pattern: so you notice it, get rid of it, then what? You aren’t gaining some incredible power of finding the correct answer (you’ll just come up with something else that’s wrong). It’s something you always find in cults: thought reform, unlearn-what-you-learnt style. You don’t find people sitting at desks doing math exercises all day, being ranked for correctness and taught how to be correct; that would be a school or university course. It is boring, it’s no silver bullet, and it takes time.
Why are you here then? Please leave.
Are you intentionally trying to promote evaporative cooling?
Evaporative cooling regarding that attitude and this behavioral pattern? ABSOLUTELY!
Boredom. You guys are highly unusual, I have to give you that.
Might I suggest using fungibility? There are more effective ways than LW to treat boredom and desire for unusual conversation, if you pursue them separately.