Let me try to clarify what I mean.
Right now, as I’m writing this, someone coming across the LW home page would have some grounds to conclude that LW is advising:
reading interesting books or articles with quotable material
attending meetups
introspective exercises on why we reject some actions
watching your thoughts and words
brainstorming exercises
measuring your aversions
writing (or maybe reading) horoscopes
(I’m omitting posts which appear to be purely informational.)
Over the course of the next few weeks, this list will change until new content has entirely replaced the old; at that point, if you asked the question “what is LW advising?” again, you’d see something different, maybe with substantial overlap with the above list, maybe not.
So that is one procedure to (attempt to) determine LW’s major themes of short-term advice.
My point is that different procedures may yield different results, for different readerships.
Cryonics comes up every so often, but may or may not be perceived as a major theme—depending on how you read LW.
ETA: if you’re going to count all-time upvotes, then it would make more sense for me to do a systematic survey: rank all posts by number of upvotes, possibly normalize by how long ago the material has been posted (more recent material has had less time to accumulate upvotes), extract from each post what advice it gives if applicable. What seems to be going on for both you and the OP is that you rank as “major” the things that have struck you the most. They may have struck you the most precisely because they were most unconventional, in which case you will come to unsound conclusions.
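The survey idea above can be sketched in a few lines. This is a minimal illustration with made-up post data (the titles, scores, and dates are hypothetical, not real LW posts): normalize each post’s upvotes by its age so recent posts aren’t penalized for having had less time to accumulate votes, then rank by that rate.

```python
from datetime import datetime, timezone

# Hypothetical post records: (title, upvotes, date posted).
posts = [
    ("Post A", 120, datetime(2009, 4, 1, tzinfo=timezone.utc)),
    ("Post B", 40, datetime(2011, 1, 15, tzinfo=timezone.utc)),
    ("Post C", 75, datetime(2010, 6, 1, tzinfo=timezone.utc)),
]

# Fixed "now" so the example is reproducible.
now = datetime(2011, 3, 1, tzinfo=timezone.utc)

def upvotes_per_day(post):
    """Age-normalized score: raw upvotes divided by days since posting."""
    title, upvotes, posted = post
    age_days = max((now - posted).days, 1)  # avoid division by zero
    return upvotes / age_days

# Rank posts by age-normalized score instead of raw upvote totals.
ranked = sorted(posts, key=upvotes_per_day, reverse=True)
for post in ranked:
    print(post[0], round(upvotes_per_day(post), 3))
```

Note how the ranking flips relative to raw scores: the newest post can come out on top even with the fewest total upvotes, which is exactly the bias the normalization is meant to correct.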
site:lesswrong.com “artificial intelligence” = 30,700 results
site:lesswrong.com rationality = 13,500 results
site:lesswrong.com “Singularity” = 32,000 results
site:lesswrong.com bias = 5,230 results
site:lesswrong.com “cryonics” = 1,680 results
site:lesswrong.com bayes = 1,660 results
site:lesswrong.com “evolutionary psychology” = 804 results
site:lesswrong.com “Bayes’ theorem” = 689 results
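Taking the quoted estimates at face value (and they are only Google’s rough estimates), the relative prominence of each term can be computed directly — here expressed as a multiple of the “rationality” count:

```python
# The quoted Google result estimates, copied from the list above.
counts = {
    "artificial intelligence": 30700,
    "rationality": 13500,
    "Singularity": 32000,
    "bias": 5230,
    "cryonics": 1680,
    "bayes": 1660,
    "evolutionary psychology": 804,
    "Bayes' theorem": 689,
}

# Express each count relative to "rationality", sorted by raw count,
# to see which topics these (rough) estimates make look most prominent.
for term, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{term}: {n / counts['rationality']:.2f}x rationality")
```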
This doesn’t prove anything, but I thought it was interesting. You can conduct your own searches; what results would you anticipate on a site like LessWrong if it cared most strongly about rationality and much less about topics like AI and cryonics?
I thought this was because of the logo at the top of the page, so I searched for “Singularity Institute for Artificial Intelligence” and got:
site:lesswrong.com “Singularity Institute for Artificial Intelligence” = 111,000 results
So something’s weird. Also, if you move “site:lesswrong.com” to the right side of the query you get 116,000 instead.
Google’s result counter is an estimate, and not a very good one. It’s within 2 or 3 orders of magnitude… usually.
You’re right.
Or maybe those result counts don’t measure what you think they measure.