I see it as an applause light. On Less Wrong, talking about consciousness arising from physical structures is a way to show your allegiance to reductionism and physicalism, and you can hardly go wrong doing that.
The reason I’m being a pain about this sort of stuff is that I have an allergy to signs that our tribal beliefs—atheism, physicalism, MWI, memetics, reductionism, etc.—are being used as synonyms for rationality. They’re just ideas, theories. Even if they’re all correct (I’m not sure about some of them), if reality were different, they’d be wrong. So they’re hardly the essence, or key ideas, of rationality, the way map/territory, evidence, Bayes’ theorem, etc. can be said to be key concepts of rationality.
“The whole idea of a unified universe with mathematically regular laws, that was what had been flushed down the toilet; the whole notion of physics. Three thousand years of resolving big complicated things into smaller pieces, discovering that the music of the planets was the same tune as a falling apple, finding that the true laws were perfectly universal and had no exceptions anywhere and took the form of simple math governing the smallest parts, not to mention that the mind was the brain and the brain was made of neurons, a brain was what a person was -
And then a woman turned into a cat, so much for all that.
“Right,” Harry said, somewhat dazed. He pulled his thoughts together. The March of Reason would just have to start over, that was all; they still had the experimental method and that was the important thing.”
I quoted that because it illustrates my point: reality is “somehow separate from my very best hypotheses”, all those specific “rational” beliefs might need to be abandoned at some point, and the methods of rationality would still apply.
So when merely professing these beliefs is somehow taken as imparting a lesson on rationality (like when a quote about memetics gets upvoted as a rationality quote, even though it carries no rationality lesson in itself), something feels wrong to me.
Agreed. To pick a more extreme example: Tribal beliefs here seem to also include cryonics, life-extensionism, and immortalism. While I agree that these are desirable goals, if feasible, the frequent assumption here of their practical feasibility seems to rest on expectations of continued economic, technical, and medical progress, expectations that run contrary to recent evidence. (To put my own cards on the table—I’m an Alcor member, but more out of status quo bias from the 1990s, when it looked like Drexler/Merkle nanotech was going to be funded, succeed, and be applied to medicine on a timescale of a decade or so. Didn’t happen.)
Quoth Peter Thiel: “The single most important economic development in recent times has been the broad stagnation of real wages and incomes since 1973, the year when oil prices quadrupled. To a first approximation, the progress in computers and the failure in energy appear to have roughly canceled each other out. Like Alice in the Red Queen’s race, we (and our computers) have been forced to run faster and faster to stay in the same place.”
An alternative to the consensus view on LW, equally physicalist, is to see significant further healthy life span gains as unlikely. Under the practical technological and medical constraints that we face, it might be more helpful to look towards easier assisted suicide than towards cryonics and similar options that rely heavily on progress that has in many ways stalled.
I don’t know if it’s just me, but I have to say that I don’t get the impression that cryonics and those other topics are tribal beliefs here.
They are popular, but tribal beliefs aren’t the same as merely popular topics. Rather, it’s the stuff which gets taken for granted by the supermajority of those who bother or dare to speak up, and which makes for easy applause lights.
The feasibility of cryonics and the rationality of the choice to get frozen have been the subject of very real debate, and if the author of this post chose to say something in favor of cryonics as an example of a “key rationality point”, I bet that would get challenged quite readily.
Also, at the risk of testing everyone’s tolerance for the density of MarkusRamikin posts on a page (sorry!), I’d like to make something clear. I fear I might sound like I think:
scarcity of debate → tribal belief → bad.
That is not so. I’ve no love for fake debate for the sake of debate, and I don’t think the fact that some beliefs are shared so widely here that there is virtually no debate is in itself wrong. Even in the best rationalist community you could imagine, this would happen—precisely because rationality is supposed to help us narrow down on true beliefs, which necessarily means that if our rationality is on the whole greater than the wider society’s, our beliefs should show convergence.
That this community consensus leads to some tribalism is probably an unavoidable side effect. But it’s the sort of entropy we need to remain vigilant for and pump out.
depth != breadth
Everyone agreeing that one thing is more likely than any alternative (MWI for instance) does not mean that there is consensus about how likely it or other things are.
Fair enough. I’m not claiming that there is a supermajority solidly convinced of the practical feasibility of cryonics, and significant life extension, and immortalism. My impression is rather that most of the hypotheticals I see here posit more medical and technical progress than observation supports. Now, these are hypotheticals—for instance, the discussion of the consequences of various (large!) degrees of life extension (starting at 1000-year lifespans) in the responses to the original post on this page. It is perfectly valid to discuss improbable hypotheticals. Nonetheless, I get the impression that very few of the hypotheticals explored on LW posit anything close to the stagnation we’ve actually seen in many fields. Perhaps it doesn’t count as a tribal belief, but it does seem to set the tone of the discussion here—and not in the direction of making the discussion less wrong :-)
As false choices ignoring third alternatives go... this is an interesting one to set up.