Who said anything about mindhacking?
Raemon did. It’s a ritual, deliberately styled after religious rituals, some of the most powerful mindhacks known.
As of now, there is no evidence that the average LessWronger is more rational than the average smart, educated person (see the LW poll). Therefore, a lot of LWers thinking something is not any stronger evidence for its truth than any other similarly-sized group of smart, educated people thinking it. Therefore, until we get way better at this, I think we should be humble in our certainty estimates, and not do mindhacky things to cement the beliefs we currently hold.
The line that people tend to quote there is “מנהג ישראל דין הוא” (the custom of Israel is law), but most people have never looked up its formal definition. Its actual halachic bearing is much too narrow to justify (for example) making kids sing Shabbat meal songs.
Correct me if I’m wrong, but it looks like you’re talking about anti-deathism (weak or strong) as if it were a defining value of the LessWrong community. This bothers me.
If you’re successful, these rituals will become part of the community identity, and I personally would rather LW tried to be about rationality and just that as much as it can. Everything else that correlates with membership—transhumanism, nerdiness, thinking Eliezer is awesome—I would urge you not to include in the rituals. It’s inevitable that they’d turn up, but I wouldn’t give them extra weight by including them in codified documents.
As an analogy, one of the things that bugged me about Orthodox Judaism was that it claims to be about keeping the Commandments, but there’s a huge pile of stuff that’s done just for tradition’s sake, that isn’t commanded anywhere (no, not even in the Oral Lore or by rabbinical decree).
So everyone in the human-superiority crowd gloating about how they’re superior to mere machines and formal systems, because they can see that Gödel’s Statement is true just by their sacred and mysterious mathematical intuition… ”...Is actually committing a horrendous logical fallacy [...] though there’s a less stupid version of the same argument which invokes second-order logic.”
So… not everyone. In Gödel, Escher, Bach, Hofstadter presents the second-order explanation of Gödel’s Incompleteness Theorem, and then goes on to discuss the “human-superiority” crowd. Granted, he doesn’t give it much weight—but for reasons that have nothing to do with first- versus second-order logic.
Don’t bash a camp just because some of their arguments are bad. Bash them because their strongest argument is bad, or shut up.
(To avoid misunderstanding: I think said camp is in fact wrong.)
I perceive most signalling as a waste of resources, and think that cultivating a community that tries to minimize unnecessary signalling would be good.
Correcting spelling errors doesn’t waste many resources. But yeah, the amount of pointless signalling that goes on in the nerd community is kind of worrying.
Why do I do it myself? Force of habit, probably. I was the dumbest person in my peer group throughout high school, so I had to consciously cultivate an image that made me worth their attention, which I craved.
It’s kind of saddening that this kind of problem draws my attention much quicker than serious logical problems.
To be fair, they’re a hell of a lot easier to notice. Although there’s probably a signalling issue involved as well—particular kinds of pedantry are good ways of signalling “nerdiness”, and I think most LWers try to cultivate that kind of image.
The founders of Castify are big fans of Less Wrong, so they’re rolling out their beta with some of our content.
Twitch.
But seriously, this is great. I’m trying to get into the habit of using podcasts and recorded lectures to make better use of my time, especially while travelling.
Stealing?
I took “spiritual” to mean in this context that you don’t believe in ontologically basic mental entities, but still embrace feelings of wonder, majesty, euphoria, etc. typically associated with religions when contemplating/experiencing the natural world.
Notice that other people answering my question had different interpretations. I left it blank.
Do you not have a preference for low/high redistribution of wealth because you haven’t studied enough economics, or because you have studied economics and haven’t found a satisfying answer?
Because I haven’t studied economics beyond the Wikipedia level, and systems with large numbers of humans involved are really, really complicated. Why so many democratic citizens feel qualified to intuit their way to an opinion is beyond me.
Two questions, as I take the survey:
What does “spiritual” mean, in the context of “Atheist [but | and not] spiritual”?
I genuinely have no idea whether I’d prefer low or high redistribution of wealth. What do I tick for my political opinion?
Depends what you mean by “familiar”. I’d imagine anyone reading the essay can do algebra, but that they’re still likely to be more comfortable when presented with specific numbers. People are weird like that—we can learn general principles from examples more easily than from having the general principles explained to us explicitly.
Exceptions abound, obviously.
Remove from your life everything you forget; what is left is you.
Can we just agree that English doesn’t have a working definition for “self”, and that different definitions are helpful in different contexts? I don’t think there’s anything profound in proposing definitions for words that fuzzy.
I think it does. Can’t believe I missed that.
Actually, this fits well with my personal experience. I’ve frequently found it easier to verbalize sophisticated arguments for the other team, since my own opinions just seem self-evident.
I suspect sheep would be less susceptible to this sort of thing than humans.
The study asked people to rate their position on a 9-point scale. People who took more extreme positions, while more likely to detect the reversal, also gave the strongest arguments in favour of the opposite opinion when they failed to detect the reversal.
Also, the poll had two kinds of questions. Some of them were general moral principles, but some of them were specific statements.
“Easy to communicate to other humans”, “easy to understand”, or “having few parts”.
Am I the only one who thinks we should stop using the word “simple” for Occam’s Razor / Solomonoff’s Whatever? In 99% of use-cases by actual humans, it doesn’t mean Solomonoff induction, so it’s confusing.
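To make the distinction concrete, here is a small sketch of why “simple” in the Solomonoff/Kolmogorov sense differs from everyday usage. Compressed size is only a crude stand-in for description length (true Kolmogorov complexity is uncomputable), and the helper name here is my own, but it illustrates the point: a string that looks no easier to a human can be vastly “simpler” in the description-length sense.

```python
import random
import string
import zlib

def description_length(s: str) -> int:
    """Crude proxy for Kolmogorov complexity: size of the string
    after compression. A highly regular string has a short
    'program' (e.g. 'ab' * 500) and compresses well; a random
    string does not."""
    return len(zlib.compress(s.encode()))

# Two strings of equal length: one patterned, one random.
patterned = "ab" * 500
random.seed(0)
random_ish = "".join(random.choices(string.ascii_lowercase, k=1000))

# Both read as meaningless letter soup to a human, yet the
# patterned one is far "simpler" by description length.
print(description_length(patterned))   # small: tens of bytes
print(description_length(random_ish))  # large: hundreds of bytes
```

So “having few parts” or “easy to understand” track human intuition, while Occam’s Razor in the Solomonoff sense tracks something closer to the compressed size above — which is why the single word “simple” invites confusion.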
Don’t think you can fuck with people a lot more powerful than you are and get away with it.
I’m no expert, but that seems to be the moral of a lot of Greek myths.
Oh, sorry, neither did I. I’m not trying to accuse Raemon of deliberate brainwashing. But getting together every year to sing songs about, say, existential risk will make people more likely to disregard evidence showing that X-risk is lower than previously thought. Same for every other subject.