Nomination: Common Sense Atheism.
BenAlbahari
Blogroll / Side Bar Section for Links to Rationality Related Websites. I love Overcoming Bias, but it seems a bit biased that Overcoming Bias is the only other website linked from here.
Reply to this comment with a comment for each website nomination?
Hmm… maybe with this feature new links could be added by users (presuming a minimum karma criterion), and then other users could vote each link up and down, so that the ordering of the list was organic.
Emotional awareness is a skill that can be cultivated, and increases one’s agreeableness. Watch a disagreeable person in action and it’s pretty obvious that they’re not really picking up how other people are reacting to their behavior. Note that it’s much easier to see disagreeable behavior in others than in oneself. The challenge in becoming more agreeable lies partly in seeing oneself as others see you.
if you really want to know how valid a particular idea you’ve read is, there are quantitative ways to get closer to answering that question.
The ultimate in quantitative analysis is to have a system predict what your opinion should be on any arbitrary issue. The TakeOnIt website does this by applying a collaborative filtering algorithm on a database of expert opinions. To use it you first enter opinions on issues that you understand and feel confident about. The algorithm can then calculate which experts you have the highest correlation in opinion with. It then extrapolates what your opinion should be on issues you don’t even know about, based on the assumption that your expert agreement correlation should remain constant. I explained the concept in more detail a while ago on Less Wrong here, but have since actually implemented the feature. Here are TakeOnIt’s predictions of Eliezer’s opinions. The more people add expert opinions to the database, the more accurate the predictions become.
Note that the website currently requires you to publicly comment on an issue in order to get your opinion predictions. A few people have requested that you should be able to enter your opinion without having to comment. If enough people want this, I’ll implement that feature.
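To make the prediction mechanism concrete, here is a minimal sketch of the idea described above. The function names, the -2..+2 opinion scale, and the weighting scheme are my own assumptions for illustration, not TakeOnIt’s actual implementation:

```python
# Sketch: predict a user's stance on an unseen issue from the stances of
# experts they tend to agree with. Opinions use a -2..+2 agree/disagree scale.

def agreement(user_opinions, expert_opinions):
    """Mean agreement between a user and an expert over shared issues.
    Returns a value in [-1, 1], or None if they share no issues."""
    shared = set(user_opinions) & set(expert_opinions)
    if not shared:
        return None
    # Each issue contributes 1 for identical stances, -1 for opposite ones.
    scores = [1 - abs(user_opinions[i] - expert_opinions[i]) / 2 for i in shared]
    return sum(scores) / len(scores)

def predict(user_opinions, experts, issue):
    """Predict the user's stance on `issue` as an agreement-weighted average
    of the stances of experts who have weighed in on it."""
    weighted, total = 0.0, 0.0
    for expert_opinions in experts.values():
        if issue not in expert_opinions:
            continue
        w = agreement(user_opinions, expert_opinions)
        if w is None or w <= 0:
            continue  # skip experts with no shared issues, or net disagreement
        weighted += w * expert_opinions[issue]
        total += w
    return weighted / total if total else None
```

The assumption doing the work, as stated above, is that your agreement correlation with each expert stays roughly constant across issues.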
one of my dreams is that one day we could develop tools that … allowed you to estimate how contentious that claim was, how many sources were for and against it… and links to … tell you about who holds what opinions, and allowed you to somewhat automate the process of reading and making sense of what other people wrote.
That’s more or less the goal of TakeOnIt. I’d stress that the biggest challenge here is populating the database of expert opinions rather than building the tools.
An even more ambitious project: making a graph of which studies invalidate or cast doubt on which other studies, on a very big scale, so you could roughly pinpoint the most certain or established areas of science. This would require some kind of systematic method of deducing implication, though.
Each issue on TakeOnIt can be linked to any other issue by adding an “implication” between two issues. Green arrows link supporting positions; red arrows link contradictory positions. So for example, the issue of cryonics links to several other issues, such as the issue of whether information-theoretic death is the most real interpretation of death (which if true, supports the case for cryonics).
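The implication structure described above can be sketched as a directed graph whose edges carry a polarity. This is an illustrative toy, not TakeOnIt’s actual data model; the issue names are examples:

```python
# Toy issue-implication graph: green arrows (+1) link supporting positions,
# red arrows (-1) link contradictory ones.

from collections import defaultdict

class IssueGraph:
    def __init__(self):
        # conclusion -> list of (premise, polarity) pairs
        self.edges = defaultdict(list)

    def add_implication(self, premise, conclusion, polarity):
        """Record that `premise` supports (+1) or undermines (-1) `conclusion`."""
        self.edges[conclusion].append((premise, polarity))

    def premises(self, issue):
        """All premises bearing on `issue`, with their polarities."""
        return self.edges[issue]

g = IssueGraph()
g.add_implication("information-theoretic death is the real death",
                  "cryonics is worth it", +1)
```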
I guess the moral is “Don’t trust anyone but a mathematician”?
Safety in numbers? ;)
Perhaps it’s useful to distinguish between the frontier of science vs. established science. One should expect the frontier to be rather shaky and full of disagreements, before the winning theories have had time to be thoroughly tested and become part of our scientific bedrock. There was a time after all when it was rational for a layperson to remain rather neutral with respect to Einstein’s views on space and time. The heuristic of “is this science established / uncontroversial amongst experts?” is perhaps so boring we forget it, but it’s one of the most useful ones we have.
To evaluate a contrarian claim, it helps to break down the contentious issue into its contentious sub-issues. For example, contrarians deny that global warming is caused primarily by humans, an issue which can be broken down into the following sub-issues:
Have solar cycles significantly affected earth’s recent climate?
Does cosmic radiation significantly affect earth’s climate?
Has earth’s orbit significantly affected its recent climate?
Does atmospheric CO2 cause significant global warming?
Do negative feedback loops mostly cushion the effect of atmospheric CO2 increases?
Are recent climatic changes consistent with the AGW hypothesis?
Is it possible to accurately predict climate?
Have climate models made good predictions so far?
Are the causes of climate change well understood?
Has CO2 passively lagged temperature in past climates?
Are climate records (of temperature, CO2, etc.) reliable?
Is the Anthropogenic Global Warming hypothesis falsifiable?
Does unpredictable weather imply unpredictable climate?

It’s much easier to assess the likelihood of a position once you’ve assessed the likelihood of each of its supporting positions. In this particular case, I found that the contrarians made a very weak case indeed.
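As a rough illustration of combining sub-issue assessments like the ones above, here is a toy scoring heuristic. This is my own simplification for illustration, not a rigorous Bayesian aggregation: each sub-issue gets your probability that the answer is “yes” and a flag for whether “yes” supports the contrarian case.

```python
# Toy heuristic: average how much each sub-issue assessment supports the
# contrarian position. Probabilities and flags below are illustrative.

def contrarian_support(sub_issues):
    """Average support for the contrarian position, in [0, 1].
    sub_issues: list of (p_yes, yes_supports_contrarians) pairs."""
    scores = [p if supports else 1 - p for p, supports in sub_issues]
    return sum(scores) / len(scores)

sub_issues = [
    (0.2, True),   # solar cycles drive recent climate?
    (0.1, True),   # cosmic radiation a significant driver?
    (0.8, False),  # atmospheric CO2 causes significant warming?
]
```

A simple average treats the sub-issues as equally weighty and independent, which they aren’t; the point is only that an explicit breakdown forces each link in the contrarian argument to be assessed on its own.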
If you have social status, it is worth sparing some change in getting used to not only being wrong, but being socially recognized as wrong by your peers...
Emperor Sigismund, when corrected on his Latin, famously replied:
I am king of the Romans and above grammar.
I know that most men — not only those considered clever, but even those who are very clever and capable of understanding most difficult scientific, mathematical, or philosophic problems — can seldom discern even the simplest and most obvious truth if it be such as obliges them to admit the falsity of conclusions they have formed, perhaps with much difficulty — conclusions of which they are proud, which they have taught to others, and on which they have built their lives.
— Leo Tolstoy, 1896 (excerpt from “What Is Art?”)
Illusory superiority seems to be the cognitive bias to overcome here.
Voted up.
if you have to choose between fitting in with your group etc and believing the truth, you should shun the truth.
I think many people develop a rough map of other people’s beliefs, to the extent that they avoid saying things that would compromise fitting in the group they’re in. Speaking of which:
irrationalists free-ride on the real-world achievements of rationalists
Trying to get to level 4 are we? (Clearly I’m not ;)) Conversely, you could argue that “irrationalists” are better at getting things done due to group leverage and rationalists free-ride off those achievements.
Perhaps it would be a good idea to remember, and keep remembering, and make it clear in your writing, that “women” are not a monolithic block and don’t all want the same thing.
A woman who doesn’t want a generalization applied to her? :)
For me, understanding “what’s really going on” in typical social interactions made them even less interesting than when I didn’t.
Merely “tuning in” to a social interaction isn’t enough. Subtextual conversations are often tedious if they’re not about you. You have to inject your ego into the conversation for things to get interesting.
So if I’m with a bunch of people from my class … and none of us have any major conflict of interest...
If you were a character in a sitcom I was writing, I’d have your dream girl walk in just as you were saying that.
It seems this post bundled together the CPU vs. GPU theory regarding the AS vs. NT mindset, with a set of techniques on how to improve social skills. The techniques however—and in a sense this is a credit to the poster—are useful to anyone who wants to improve their social skills, regardless of whether the cause of their lack of skill is:
1) High IQ
2) Introversion
3) Social Inexperience
4) AS
5) A combination of several of these factors

It’s possible to place too much importance on looking for a root cause. The immediate cause is simply a lack of understanding of social interaction—the techniques will help anyone develop that understanding.
If you lack that powerful social coprocessor… [you will]...explicitly reason through the complex human social game that most people play without ever really understanding.
Some NTs are somewhat unconscious of the game, but that doesn’t mean they don’t understand it. I’d argue the most useful definition of “understanding” is that one’s brain contains the knowledge—whether one is conscious of it or not—that enables one to successfully perform the relevant task. Any other definition is, quite literally, academic. Furthermore, I’d argue that those best at the game actually become conscious of what is unconscious for most people, such as the degree to which status plays a role in social interaction. This helps them gain an edge over others, such as better predicting the ramifications of gossip, or the ability to construct a joke. A joke that works well socially often consists of the more socially aware person bringing to the surface an aspect of someone else’s self-serving behavior that was previously just under the social group’s conscious radar. It would be impossible to construct such jokes without a conscious understanding of the game.
(Most importantly) Find a community of others—who are trying to solve the same problem
If you want to learn social skills, hang out with people who have them. And it’s not enough to just hang out—you have to enjoy it and participate. And to be frank, often the easiest way to do that is with alcohol. And don’t assume you’re so different from other people—why do you think they’re drinking?
Thanks for the feedback.
there’s a lot of chaff.
Do you mean chaff as in “stuff that I personally don’t care about” or chaff as in “stuff that anyone would agree is bad”?
there doesn’t seem to be enough activity yet.
Yes, the site is still in the bootstrapping phase. Having said that, the site needs to have a better way of displaying recent activity.
Franklin’s quote is more about cryonics being good if it were feasible than about it actually being feasible. Ben, do you think it should be moved to this question?
Good call.
to even include some of these people together is simply to give weight to views which should have effectively close to zero weight.
No no no! It’s vital that the opinions of influential people—even if they’re completely wrong—are included on TakeOnIt. John Stuart Mill makes my point perfectly:
...the peculiar evil of silencing the expression of an opinion is… If an opinion is right, [people] are deprived of the opportunity of exchanging error for truth: if wrong, they lose what is almost as great a benefit, the clearer perception and livelier impression of the truth, produced by its collision with error.
P.S. I updated the tag line for Conservapedia from “Encyclopedia” to “Christian Encyclopedia”. Thanks for pointing that out.
TakeOnIt records the opinions of BOTH experts and influencers—not just experts. Perhaps I confused you by not being clear about this in my original comment. In any case, TakeOnIt groups opinions by the expertise of those who hold the opinions. This accentuates—not blurs—the distinction between those who have relevant expertise and those who don’t (but who are nonetheless influential). It also puts those who have expertise relevant to the question topic at the top of the page. You seem to be saying readers will easily mistake an expert for an influencer. I’m open to suggestions if you think it could be made clearer than it is.
A website has a specific goal that it’s trying to uniquely achieve, and a general goal that places it within a community of like-minded websites. Less Wrong’s specific goal is to refine the art of human rationality, and its general goal is to raise the sanity waterline. If other websites are successfully raising the sanity waterline, it behooves Less Wrong to link to them.
I agree that there are genuine challenges in selecting which websites to link to, especially for a community blog. But a community blog, if it meets those challenges, actually has the greater potential to choose a good set of links. Less Wrong should strive to have a better set of links than its sister site, Overcoming Bias. These links matter. A blogroll is a standard feature of blogs, and for good reason. I’ve discovered many great websites this way. Unfortunately, never via Less Wrong.
While I think high-karma Less Wrong users deserve promotion, it’s not the only criterion by which promotion is justified. If there’s a great sanity waterline raising website out there, it should be linked to, whether or not there’s a high-karma Less Wrong user running it. On my own website I link to Wikipedia’s argument fallacy list and cognitive bias list. Without digressing into a debate as to whether Less Wrong should link to these lists too, I’ll merely point out that with the criterion you’re suggesting, such links would necessarily have zero value. I think JGWeissman’s proposal would choose the appropriate value for such links.