I like the idea of verified experts and fact-checking, though in practice I have much lower expectations of the mass public actually finding high-quality experts.
I feel like my post is going to be way too negative, and I’m sick right now with my cognition mildly compromised, but I like the idea of effectively spreading good ideas, so I’ll make it anyway.
It also makes it much easier for people with shared interests to get in contact.
This is red flag #1 for me. Even with the ability to use the internet to connect to massive networks of people who have different ideas than us, people still seem to have the habit of connecting with people who are already like-minded or close to like-minded. If your main social networking is Facebook and a list of subreddits that don’t annoy you, then you’re less likely to be exposed to ideas that you would consider both different from your current mindset and worth considering. If everyone is obtaining new ideas from their family, real-life friends (who have been selected for similar preferences), religious group, a news channel that shares their political bent, online friends (acquired through similar interests), and, possibly worst of all, a dictionary that shares their political bent, then they aren’t going to be getting quality new ideas.
Ways to break this cycle could include trojan-horsing people into it: creating multiple presentations/versions of an idea and quietly exposing separate groups to whichever version is most effective at gaining that group’s following; figuring out ways of presenting information that skips over worldviews (I’m assuming this is hard); or some other method someone cleverer than me on here can think of. If you want to do something like get everyone to read the Sequences, you could create a handful of versions that covertly make bad but initially believable attempts to convince each group that its current worldview is correct. (I may go to rationalist hell just for saying that. Please, no one attempt it without reading the comments below making clear what a stupid idea it is.)
This will of course lead to a tremendous increase in the amount of false or useless information. But it will also lead to an increase in true and relevant information.
People still have limited time and limited resources for obtaining information. Figuring out ways to decrease the false and useless information people absorb while increasing the true and relevant information would be handy.
We also run into the problem that experts are wrong on occasion and our best attempts at true and relevant information fail. Correcting widely spread incorrect information should have a higher priority than a tiny blurb in the hard-to-see corner of a newspaper. Are there any blogs for “Things You Were Told By Experts Not Very Long Ago That We Are Now Pretty Damn Sure Are Not Correct”? If so, please tell me.
Now members of the cognitive elite are, or so I claim, reasonably good at distinguishing between good and bad ideas.
They do? I thought we just pushed ideas that match our worldviews, are novel, and that we think other people will enjoy or give us status for.
On the other hand, [non-elite groups] are likely to be heavily influenced by the cognitive elite, especially in the longer run.
In the long run, yeah, I can see that. People are influenced by new political ideologies, and you have to be pretty smart to think of a brand-new political ideology (to name a single example). However, I’m not sure the pathways that ideas follow from the cognitive elite to the cognitively lacking are pathways that select for the ideas you might want to see spread. Ideas that spread are selected for Divine Plaguespreadingness (I want to use the term memetic virulence here, but I don’t know enough about memetics yet and don’t want to explain it away with a fancy term).
Finding ways to hack into the normal Divine Plaguespreadingness could be useful here, as could finding ways to skip the normal pathways that ideas follow when moving from higher-IQ to lower-IQ populations. The same applies to different populations with similar IQ. Rationality is likely nowhere near hitting critical mass in the high-IQ population of the English-speaking world. If we can reach a larger percentage of that population, then other populations may follow. HPMOR certainly counts as branching into another population that might benefit from rationality concepts but wouldn’t normally be exposed to LW. Are any other groups trying to branch into other populations in similar ways? The online Harry Potter fanfiction community almost certainly can’t be the single low-hanging fruit here, guys.
There are a couple of ways of addressing this problem. One is better reputation/karma systems. That would both incentivize people to disseminate true and relevant information, and make it easier to find true and relevant information.
Oh please god no! No no no no no. If you want to go with a handful of experts, Metacritic- or Rotten Tomatoes-style, then okay, we can try that out. However, I don’t trust democracy to be an effective way of finding good ideas and spreading them to people. Democracy is a way of putting the majority in control so it can slowly shift from its current position to another close-but-similar-and-widely-agreed-upon position. It just produces groupthink over time (which can be good politically and could help avoid political upheaval, but isn’t good for what we want it for).
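To make the contrast concrete, here is a minimal sketch of the difference between the two aggregation schemes. Everything in it is invented for illustration; neither the panel sizes nor the numbers come from anywhere real:

```python
# Hypothetical sketch: a Metacritic-style expert panel score vs. a raw
# majority vote. All names and numbers are made up for illustration.

from statistics import mean

def expert_score(expert_ratings):
    """Average the ratings of a small, vetted panel of experts (0-100)."""
    return mean(expert_ratings)

def majority_verdict(crowd_votes):
    """Raw democratic vote: True if more than half the crowd approves."""
    return sum(crowd_votes) > len(crowd_votes) / 2

# A contrarian-but-good idea: the panel rates it highly...
panel = [82, 75, 90]                 # three vetted experts
crowd = [True] * 40 + [False] * 60   # ...but the majority rejects it

print(expert_score(panel))      # ~82.3 -> surfaced by the panel system
print(majority_verdict(crowd))  # False -> buried by the democratic system
```

The point of the sketch is just that the panel system can surface an idea the majority would bury, which is exactly the failure mode I worry about with karma-style voting.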
We would still need effective methods of doing double-blind reviews, with future changes of opinion only allowed when made publicly and with the past opinion still visible. That, or some other method by which we can get people to make reviews while avoiding biases like signaling their in-group, matching positions with people they regard as higher-IQ than themselves, and all of the other mind-killing biases out there.
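As a rough illustration of the record-keeping half of that (not the double-blinding), here is a minimal sketch of an append-only opinion log; every name in it is hypothetical:

```python
# Hypothetical sketch: an append-only log of a reviewer's opinions.
# Revisions never overwrite history; the old opinion stays visible
# alongside the public, timestamped change.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ReviewHistory:
    reviewer: str
    entries: list = field(default_factory=list)  # (timestamp, opinion) pairs

    def publish(self, opinion: str) -> None:
        """Record a new opinion publicly; prior entries are kept forever."""
        self.entries.append((datetime.now(timezone.utc), opinion))

    def current(self) -> str:
        return self.entries[-1][1]

    def past(self) -> list:
        """Everything the reviewer previously claimed, still on the record."""
        return self.entries[:-1]

history = ReviewHistory("reviewer_42")
history.publish("Intervention X looks promising.")
history.publish("On reflection, the evidence for X is weak.")
print(history.current())  # the updated opinion
print(history.past())     # the original opinion, still visible
```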
I would suggest doing rationality tests, but (assuming people on LW are more rational; now that’s a worrying thought) we’d need to adjust for the fact that they would select for people who have been exposed to LW rationality or who read popular sources like SSC. Besides, rational people don’t have a monopoly on good ideas, and some ideas may be more easily obtainable if you are already heavily biased. (You don’t get loads of interesting attempted philosophical proofs of the existence of god in a nation full of atheists.) (Edit: you would get “loads”; see comments.) I think I’m just rambling and disagreeing with myself at this point, so I’ll stop this line of reasoning.
Another method is automatic quality-control of information (e.g. fact-checking). Google have done some work on this, but still, it is in its infancy. It’ll be interesting to follow the development in this area in the years to come.
I like fact-checking. Definitely an applause light going on there. If we’ve gotten the p-values of the studies low enough, the vast majority of experts agree, and there is a notable lack of studies disagreeing, then that could be very nice.
Has anyone tried setting up a quality-control engine similar to fact-checking, but one that works as a meta-analysis generator for scientific journals? It would be nice to see something along the lines of “Vitamin D is good for you. (Note: only 55% of studies agree on this for improving X, Y, and Z. There is no general consensus on the benefits of Vitamin D for anything but bone health. However, Vitamin D has been recognized as safe to take by organization A, so take it anyway and see what happens.)” And then we could also automatically sign you up for an internet database that will email you if that information changes later on, so that you can update your behaviors accordingly.
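To make that concrete, here is a minimal sketch of just the summarizing step, assuming studies have already been tagged by outcome. The study data and the 70% consensus threshold are invented; a real engine would pull from journal databases and notify subscribers when the numbers shift:

```python
# Hypothetical sketch: turn a pile of tagged study results into the kind
# of blurb described above. Study data and the 70% consensus threshold
# are invented for illustration.

def consensus_blurb(claim, studies, threshold=0.70):
    """Summarize how many studies support a claim and flag weak consensus."""
    supporting = sum(1 for s in studies if s["supports"])
    agreement = supporting / len(studies)
    note = "general consensus" if agreement >= threshold else "no general consensus"
    return (f"{claim} (Note: {agreement:.0%} of {len(studies)} studies "
            f"agree; {note}.)")

vitamin_d_studies = [
    {"id": "study_a", "supports": True},
    {"id": "study_b", "supports": False},
    {"id": "study_c", "supports": True},
    {"id": "study_d", "supports": True},
    {"id": "study_e", "supports": False},
]

print(consensus_blurb("Vitamin D improves bone health", vitamin_d_studies))
# -> Vitamin D improves bone health (Note: 60% of 5 studies agree;
#    no general consensus.)
```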
Besides, people who are rational don’t have a monopoly on good ideas and some ideas may be more easily obtainable if you are already heavily biased. (You don’t get loads of interesting attempted philosophical proofs of the existence of god in a nation full of atheists.)
Given that one of the people working in CFAR produced a genuine new proof for it making sense to believe in God that made her convert to Catholicism while being employed in CFAR, I don’t see where you get that idea.
In my experience this community is very open to thinking all sorts of contrarian ideas.
That’s very interesting, and I’d like to see her proof. Was it a new idea that religious people had not thought of and spread before?
I should change my claim. People would be likely to think of “loads” of attempted proofs.
The theory I was trying to state is that certain perspectives or states of mind may be more effective at finding certain ideas than others. “Being a contrarian” might not be a good name for a perspective, but I’ll treat it as one for the moment. Do you think that people at CFAR and LW (our local contrarians), left to their own devices, would occupy the perspectives needed to reach every single one of the literally hundreds of proofs for the existence of god, compared to people who are highly motivated by belief, social utility, and dedication to an imagined highly dangerous omnipotent deity? Are there some proofs that would be much harder or much easier for them to obtain?
I think what I may have been trying to get at was that while LW contrarianism is awesome and a pretty great method for thinking about things, it may not be the best for finding all the possible good ideas out there in idea space.