Hmm, another positive reference to Buddhism. I'm personally biased against it in all of its versions, more than I am against, say, Christianity; IMO it does not deserve all the praise/advertisement it's been getting on LW of late, and my bias against it is confirmed by the ease with which it has suddenly crept up on LW.
As a rationalist (not a technophile/libertarian etc., but one who seeks to be more rational), do you seriously believe in what Buddhism preaches? All of it?
If you're going to cherry-pick, then why call it Buddhism and praise it so? I fail to see this as being "less wrong" in any way. Maybe I just don't get it, and if so I would greatly appreciate a simple and rational explanation of why I (or Eliezer, or anyone else) should take "good" Buddhism seriously in our pursuit of rationality.
My problem is mainly the attachment of Buddhist teaching to "meditation", which seems to be a universal practice and not an exclusively Buddhist one; it is of some value, but not more than, say, studying human bias or reading the average LW top post.
As a rationalist (not a technophile/libertarian etc., but one who seeks to be more rational), do you seriously believe in what Buddhism preaches? All of it?
The fact that you ask this question is strong evidence you are being careless. You assume stupidity and are self-satisfied. You will never be a strong rationalist this way. You need to cultivate a sense that much more is possible.
If you're going to cherry-pick, then why call it Buddhism and praise it so?
Did not praise. I know that you know that your assumptions are mostly rhetorical. Still dangerous. Carelessness. Not moving in harmony with the Bayes. Begging for confirmation, is this disposition of assumption. You will be pulled off course by this. These simple skills of rationality must be perfected if one is to build very strong rationality, with very complex skills. Necessary if you are to use all of your cognitive aspects and limitations to achieve all that is possible. Only possible to use limitation of affective thoughts for good after one is very consistently strong rationalist. Must be able to hold a very steady course in mindspace, in conceptspace, in identityspace, before one can try to use powerful attractors like affect to accelerate along that course. Less Wrong folk cannot do this consistently. Almost no one can. Enlightened people, mostly; maybe others from other disciplines that I know not. I cannot yet do so. Perhaps not far, though.
I would greatly appreciate a simple and rational explanation of why I (or Eliezer, or anyone else) should take "good" Buddhism seriously in our pursuit of rationality.
Less Wrong is not really worth my time, except as providing a motivation to write. The epistemological gap between Less Wrong and me is growing too wide. Eliezer I may talk to next time he’s around, I guess. The epistemological gap between Eliezer and me is growing narrower. Still many levels above me is Eliezer, but I think only 2.2 levels or so. Easily surmountable with recursive self-improvement.
My problem is mainly the attachment of Buddhist teaching to "meditation", which seems to be a universal practice and not an exclusively Buddhist one; it is of some value, but not more than, say, studying human bias or reading the average LW top post.
We do not live in Gautama’s time. Almost all of Theravada is true, but most is not relevant for rationalists of my caliber.
Virtues that he preached, we mostly have now. Smart people are cultured enough to have these virtues and understand their motivations. Evolutionary psychology and cultivated compassion. So virtue part of Buddhism, not as important, I think.
Community part of Buddhism, sangha, very important; but having a peer group of strong rationalists intent on leveling up, is its own sangha, and one better than any that could have arisen almost anywhere in the past.
Mindfulness, insight, concentration, self-control; these are the third branch of Buddhism, and the part Less Wrong needs. Thus I focus on that part. I know about human bias. I have read nearly every Less Wrong post. But understanding the algorithms behind human bias from the inside, feeling the qualia of cognitive subsystems, building those qualia, being mindful of attractors in mindspace; these are important skills for a rationalist. Knowledge of a bias is a knowledge. Important one, but so very limited. Having the disposition of feeling biases as pulls on cognition, at all levels of complexity of cognitive algorithms: this is much stronger skill. Necessary to become superintelligence. Which is everyone's desire, no? It should be, I think. Would be their desire if they knew more, thought faster, were wiser and more compassionate. Meditation, not only way of building this skill. Just oldest and most studied one. Many have trodden this path and passed on knowledge. The skills of good computational cognitive scientist, also strong. But no one writes about these skills. I think becoming superintelligent probably not possible without them. But meditation builds similar skills, and is close. Both are metaskills. Epistemological bootstrapping mechanisms. There are meta-metaskills for this. Well, there aren't yet, but I am building one, and others are building some. We sense that more is possible. Buddhism is silly. Meditation, less silly, but not important of itself. Just one metaskill. Soon more will be possible. Not sure if it will trickle down to Less Wrong. Probably not. The gap is too wide.
Ok, I was careless, I apologize; still, the argument remains unanswered satisfactorily.
My (and others') main argument against meditation as a rationality-increasing tool is that the less-than-perfect brains we have are not sufficient at dealing with biases and so forth. I can see that you've pretty much said the same, or close to it, in your reply above, so that's that.
P.S. Disjointed sentences?