It seems to me like heavy users of Wikipedia are more likely to fill out a survey for Wikipedia users. On the other hand, there's no similar filter for the SurveyMonkey Audience.
Good point! This is something I thought a bit about but didn't get around to discussing in this post. The Slate Star Codex audience returned a total of 618 responses. I don't have a very good idea of how many people read the SSC blog carefully enough to go through all the links, but my best guess is that the number is in the low thousands. If that's the case, the response rate is 15% or higher. That's still low, but not that low.
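As a rough back-of-the-envelope check, here is a minimal sketch of that calculation in Python. Only the 618 responses come from the survey itself; the readership figures are assumptions used purely for illustration.

```python
# Implied response rate for the SSC link-roundup survey under different
# assumptions about how many readers actually follow every link.
# Only the 618 responses are from the survey; the readership figures are assumed.
responses = 618

for assumed_readers in (2000, 3000, 4000, 6000):
    rate = responses / assumed_readers
    print(f"assumed readers who follow every link: {assumed_readers:>5} "
          f"-> implied response rate: {rate:.1%}")
```

Under these assumptions, anything up to roughly 4,000 link-following readers puts the response rate at 15% or above.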
Another way of framing this: how low would the response rate have to be for the true SSC readership to actually look like the SurveyMonkey Audience or Google Surveys audiences, with the respondents being a highly unrepresentative slice? Based on the numbers, the selection bias would have to be really strong for that to happen.
So while I don't think selection for Wikipedia specifically is the driving factor here, it may make more sense to talk not about SSC readership in general, but about "SSC readers who are devoted enough and curious enough to read through every link in the link roundup."
On a related note, effective response rates for on-site Wikipedia surveys (which we didn't discuss here, but which might be the subject of future posts) can be around 0.1% to 0.2%; see for instance Why We Read Wikipedia. (To get the response rate, you would need to combine the number of responses with existing data on the number of pageviews to Wikipedia; I have emailed the researchers and confirmed that the response rate was in that ballpark.) Compared to that, the SSC response rate seems quite high, and more clearly informative about the underlying population.
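For concreteness, here is a minimal sketch comparing the two figures. The 618 responses and the 0.1%–0.2% range come from the discussion above; the assumed SSC readership and the choice of midpoint are illustrative assumptions.

```python
# Rough comparison of the two effective response rates discussed above.
# The 618 responses and the 0.1%-0.2% range are from the text; the assumed
# SSC readership and the 0.15% midpoint are illustrative assumptions.
ssc_responses = 618
assumed_ssc_readers = 4000             # assumed size of the "reads every link" audience
ssc_rate = ssc_responses / assumed_ssc_readers

wikipedia_onsite_rate = 0.0015         # midpoint of the 0.1%-0.2% range

print(f"SSC link-roundup survey:  ~{ssc_rate:.1%}")
print(f"On-site Wikipedia survey: ~{wikipedia_onsite_rate:.2%}")
print(f"The SSC rate is roughly {ssc_rate / wikipedia_onsite_rate:.0f}x higher.")
```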
It might be possible to get Scott to include a "number of Wikipedia pages read per week" question in his next census. That would give more accurate base rates.
Good idea, but I don’t think he does the census that frequently. The most recent one I can find is from 2014: http://slatestarcodex.com/2015/11/04/2014-ssc-survey-results/
The annual LessWrong survey might be another place to consider putting it. I don’t know who’s responsible for doing it in 2017, but when I find out I’ll ask them.
The 2017 SSC Survey had 5500 respondents. Presumably this survey was more widely visible and available than mine (which was one link in the middle of a long link list).
https://slatestarcodex.com/2017/03/17/ssc-survey-2017-results/