2016 LessWrong Diaspora Survey Analysis: Part Four (Politics, Calibration & Probability, Futurology, Charity & Effective Altruism)

INTRODUCTION HERE

Politics

The LessWrong survey has a very involved section dedicated to politics. In previous analyses the benefits of this weren’t fully realized. In the 2016 analysis we can look at not just the political affiliation of a respondent, but at which beliefs are associated with a given affiliation. The charts below summarize most of the results.

Political Opinions By Political Affiliation

Miscellaneous Politics

There were also some other questions in this section which aren’t covered by the above charts.

PoliticalInterest

On a scale from 1 (not interested at all) to 5 (extremely interested), how would you describe your level of interest in politics?

1: 67 (2.182%)

2: 257 (8.371%)

3: 461 (15.016%)

4: 595 (19.381%)

5: 312 (10.163%)

Voting

Did you vote in your country’s last major national election? (LW Turnout Versus General Election Turnout By Country)
Group Turnout
LessWrong 68.9%
Australia 91%
Brazil 78.90%
Britain 66.4%
Canada 68.3%
Finland 70.1%
France 79.48%
Germany 71.5%
India 66.3%
Israel 72%
New Zealand 77.90%
Russia 65.25%
United States 54.9%
Numbers taken from Wikipedia, accurate as of the last general election in each country listed at time of writing.

AmericanParties

If you are an American, what party are you registered with?

Democratic Party: 358 (24.5%)

Republican Party: 72 (4.9%)

Libertarian Party: 26 (1.8%)

Other third party: 16 (1.1%)

Not registered for a party: 451 (30.8%)

(option for non-Americans who want an option): 541 (37.0%)

Calibration And Probability Questions

Calibration Questions

I just couldn’t analyze these, sorry guys. I put many hours into trying to get them into a decent format I could even read and that sucked up an incredible amount of time. It’s why this part of the survey took so long to get out. Thankfully another LessWrong user, Houshalter, has kindly done their own analysis.

All my calibration questions were meant to satisfy a few essential properties:

  1. They should be ‘self-contained’, i.e., something you can reasonably answer, or at least try to answer, with a 5th grade science education and normal life experience.

  2. They should, at least to a certain extent, be Fermi Estimable.

  3. They should progressively scale in difficulty so you can see whether somebody understands basic probability or not. (e.g., in an ‘or’ question, do they assign a probability of less than 50% to being right?)
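Property 3 is checkable mechanically: given each respondent’s stated confidence on a question and whether they got it right, bucketing by confidence shows whether stated confidence tracks accuracy. A minimal sketch in Python (the `calibration_table` helper and the `answers` data are hypothetical, not from the survey pipeline):

```python
from collections import defaultdict

def calibration_table(answers, width=10):
    """Group (stated confidence %, answered correctly) pairs into
    confidence buckets and return the actual hit rate per bucket."""
    buckets = defaultdict(list)
    for confidence, correct in answers:
        buckets[int(confidence // width) * width].append(correct)
    return {low: sum(hits) / len(hits)
            for low, hits in sorted(buckets.items())}

# Hypothetical data: for a well-calibrated respondent the 90% bucket
# should come out near 0.9, the 50% bucket near 0.5, and so on.
answers = [(90, True), (90, True), (95, False), (50, True),
           (50, False), (10, False), (10, True), (80, True)]
print(calibration_table(answers))
```

A respondent whose hit rates roughly match the bucket labels is well calibrated; large gaps in either direction indicate over- or under-confidence.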

At least one person requested a workbook, so I might write more in the future. I’ll obviously write more for the survey.

Probability Questions

Question Sum Mean Median Mode Stdev
Please give the obvious answer to this question, so I can automatically throw away all surveys that don’t follow the rules: What is the probability of a fair coin coming up heads? 79914.0 49.82 50.0 50.0 3.03
What is the probability that the Many Worlds interpretation of quantum mechanics is more or less correct? 57667.28 44.6 50.0 50.0 29.19
What is the probability that non-human, non-Earthly intelligent life exists in the observable universe? 108971.29 75.73 90.0 99.0 31.89
…in the Milky Way galaxy? 62974.32 45.97 50.0 10.0 38.4
What is the probability that supernatural events (including God, ghosts, magic, etc) have occurred since the beginning of the universe? 12584.6 13.58 1.0 1.0 27.58
What is the probability that there is a god, defined as a supernatural intelligent entity who created the universe? 15505.55 15.47 1.0 1.0 27.89
What is the probability that any of humankind’s revealed religions is more or less correct? 8573.61 10.62 0.5 1.0 26.26
What is the probability that an average person cryonically frozen today will be successfully restored to life at some future time, conditional on no global catastrophe destroying civilization before then? 29397.65 21.23 10.0 5.0 26.78
What is the probability that at least one person living at this moment will reach an age of one thousand years, conditional on no global catastrophe destroying civilization in that time? 34307.59 25.26 10.0 1.0 30.51
What is the probability that our universe is a simulation? 30383.54 25.26 10.0 50.0 28.4
What is the probability that significant global warming is occurring or will soon occur, and is primarily caused by human actions? 121545.09 83.31 90.0 90.0 23.17
What is the probability that the human race will make it to 2100 without any catastrophe that wipes out more than 90% of humanity? 111031.97 76.31 80.0 80.0 22.93
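Each row’s statistics can be recomputed from a raw column of answers with Python’s standard library; a sketch over hypothetical responses to the fair-coin question (the `summarize` helper and the sample are illustrative, not the survey’s actual code):

```python
import statistics

def summarize(responses):
    """Sum, mean, median, mode, and sample standard deviation
    for one column of probability answers (in percent)."""
    return {
        "sum": sum(responses),
        "mean": statistics.mean(responses),
        "median": statistics.median(responses),
        "mode": statistics.mode(responses),
        "stdev": statistics.stdev(responses),
    }

# Hypothetical responses to the fair-coin question, in percent.
coin = [50.0, 50.0, 50.0, 49.0, 51.0, 50.0]
print(summarize(coin))
```

The small stdev on the real coin-flip row (about 3) is what you’d hope for on an attention-check question; the 25–30 point stdevs elsewhere show genuine disagreement.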

Calibration and Probability are probably the two areas of the survey I put the least effort into. My plan for next year is to overhaul these sections entirely and try including some Tetlock-esque forecasting questions, a link to some advice on how to make good predictions, etc.

Futurology

This section got a bit of a facelift this year, adding new questions on cryonics, genetic engineering, and technological unemployment in addition to the previous years’ questions.

Cryonics

Cryonics

Are you signed up for cryonics?

Yes—signed up or just finishing up paperwork: 48 (2.9%)

No—would like to sign up but unavailable in my area: 104 (6.3%)

No—would like to sign up but haven’t gotten around to it: 180 (10.9%)

No—would like to sign up but can’t afford it: 229 (13.8%)

No—still considering it: 557 (33.7%)

No—and do not want to sign up for cryonics: 468 (28.3%)

Never thought about it / don’t understand: 68 (4.1%)

CryonicsNow

Do you think cryonics, as currently practiced by Alcor/Cryonics Institute, will work?

Yes: 106 (6.6%)

Maybe: 1041 (64.4%)

No: 470 (29.1%)

Interestingly enough, of those who think it will work with enough confidence to say ‘yes’, only 14 are actually signed up for cryonics.

sqlite> select count(*) from data where CryonicsNow="Yes" and Cryonics="Yes—signed up or just finishing up paperwork";

14

sqlite> select count(*) from data where CryonicsNow="Yes" and (Cryonics="Yes—signed up or just finishing up paperwork" OR Cryonics="No—would like to sign up but unavailable in my area" OR Cryonics="No—would like to sign up but haven’t gotten around to it" OR Cryonics="No—would like to sign up but can’t afford it");

34
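As a side note, the separate counts above can be collapsed into a single GROUP BY; a sketch against an in-memory stand-in for the survey’s `data` table (the rows and the abbreviated option strings are hypothetical):

```python
import sqlite3

# Build a tiny in-memory stand-in for the survey's `data` table.
# Rows and option strings here are hypothetical and abbreviated.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE data (CryonicsNow TEXT, Cryonics TEXT)")
con.executemany("INSERT INTO data VALUES (?, ?)", [
    ("Yes", "Yes - signed up or just finishing up paperwork"),
    ("Yes", "No - still considering it"),
    ("Maybe", "No - still considering it"),
])

# One GROUP BY replaces the separate count(*) queries above.
rows = con.execute("""
    SELECT Cryonics, count(*) FROM data
    WHERE CryonicsNow = 'Yes'
    GROUP BY Cryonics
""").fetchall()
print(dict(rows))
```

This gives the full breakdown of Cryonics answers among “Yes” believers in one pass instead of one query per option.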

CryonicsPossibility

Do you think cryonics works in principle?

Yes: 802 (49.3%)

Maybe: 701 (43.1%)

No: 125 (7.7%)

LessWrongers seem to be very bullish on the underlying physics of cryonics even if they’re not as enthusiastic about current methods in use.

The Brain Preservation Foundation also did an analysis of cryonics responses to the LessWrong Survey.

Singularity

SingularityYear

By what year do you think the Singularity will occur? Answer such that you think, conditional on the Singularity occurring, there is an even chance of the Singularity falling before or after this year. If you think a singularity is so unlikely you don’t even want to condition on it, leave this question blank.

Sum: 1.0000000000590109e+20

Mean: 8.110300081581755e+16

Median: 2080.0

Mode: 2100.0

Stdev: 2.847858859055733e+18

I didn’t bother to filter out the silly answers for this.

Obviously it’s a bit hard to see without filtering out the uber-large answers, but the median doesn’t seem to have changed much from the 2014 survey.
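One way to sanitize this would be to drop answers outside a plausible window before recomputing the median; a sketch (the cutoffs and the sample data are hypothetical, not the survey’s actual filtering):

```python
import statistics

def filtered_median(years, lo=2016, hi=3000):
    """Drop implausible answers (joke years far outside lo..hi)
    and return the median of what's left."""
    plausible = [y for y in years if lo <= y <= hi]
    return statistics.median(plausible)

# Hypothetical responses, including two 'silly' answers that would
# otherwise dominate the mean and standard deviation.
years = [2040, 2060, 2080, 2100, 2500, 1e20, 9999999]
print(filtered_median(years))
```

The median is already fairly robust to a handful of joke answers, which is why it stays near 2080 while the mean and stdev explode.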

Genetic Engineering

ModifyOffspring

Would you ever consider having your child genetically modified for any reason?

Yes: 1552 (95.921%)

No: 66 (4.079%)

Well that’s fairly overwhelming.

GeneticTreament

Would you be willing to have your child genetically modified to prevent them from getting an inheritable disease?

Yes: 1387 (85.5%)

Depends on the disease: 207 (12.8%)

No: 28 (1.7%)

I find it amusing how the strict “No” group shrinks considerably after this question.

GeneticImprovement

Would you be willing to have your child genetically modified for improvement purposes? (e.g., to heighten their intelligence or reduce their risk of schizophrenia.)

Yes : 0 (0.0%)

Maybe a little: 176 (10.9%)

Depends on the strength of the improvements: 262 (16.2%)

No: 84 (5.2%)

Yes, I know the ‘Yes’ option is bugged; I don’t know what causes the bug, and despite my best efforts I couldn’t track it down. There is also an issue where ‘reduce their risk of schizophrenia’ is offered as an example of improvement, which might confuse people, but the actual science cuts closer to that than to a clean separation between disease risk and ‘improvement’.

This question is too important to just not have an answer to, so I’ll do it manually. Unfortunately I can’t easily remove the ‘excluded’ entries so that we’re dealing with exactly the same distribution, but only around 13 responses are filtered out anyway.

sqlite> select count(*) from data where GeneticImprovement="Yes";

1100

>>> 1100 + 176 + 262 + 84
1622
>>> 1100 / 1622
0.6781750924784217

67.8% are willing to genetically engineer their children for improvements.

GeneticCosmetic

Would you be willing to have your child genetically modified for cosmetic reasons? (e.g., to make them taller or give them a certain eye color.)

Yes: 500 (31.0%)

Maybe a little: 381 (23.6%)

Depends on the strength of the improvements: 277 (17.2%)

No: 455 (28.2%)

These numbers go about how you would expect, with people being progressively less interested the more ‘shallow’ a genetic change is seen as.

GeneticOpinionD

What’s your overall opinion of other people genetically modifying their children for disease prevention purposes?

Positive: 1177 (71.7%)

Mostly Positive: 311 (19.0%)

No strong opinion: 112 (6.8%)

Mostly Negative: 29 (1.8%)

Negative: 12 (0.7%)

GeneticOpinionI

What’s your overall opinion of other people genetically modifying their children for improvement purposes?

Positive: 737 (44.9%)

Mostly Positive: 482 (29.4%)

No strong opinion: 273 (16.6%)

Mostly Negative: 111 (6.8%)

Negative: 38 (2.3%)

GeneticOpinionC

What’s your overall opinion of other people genetically modifying their children for cosmetic reasons?

Positive: 291 (17.7%)

Mostly Positive: 290 (17.7%)

No strong opinion: 576 (35.1%)

Mostly Negative: 328 (20.0%)

Negative: 157 (9.6%)

All three of these seem largely consistent with people’s personal preferences about modification. Were I so inclined, I could do a deeper analysis that takes survey respondents row by row and looks at the correlation between preferences for one’s own children and preferences for others’.

Technological Unemployment

LudditeFallacy

Do you think the Luddite Fallacy is an actual fallacy?

Yes: 443 (30.936%)

No: 989 (69.064%)

We can use this as an overall measure of worry about technological unemployment, which would seem to be high among the LW demographic.

UnemploymentYear

By what year do you think the majority of people in your country will have trouble finding employment for automation related reasons? If you think this is something that will never happen leave this question blank.

Sum: 2203914.0

Mean: 2102.97

Median: 2050.0

Mode: 2050.0

Stdev: 1180.23

This question is flawed because answers of “never happen” can’t be distinguished from people who simply skipped the question.

An interesting question that would be fun to compare against the Singularity estimates above.

EndOfWork

Do you think the “end of work” would be a good thing?

Yes: 1238 (81.287%)

No: 285 (18.713%)

Fairly overwhelming consensus, but with a significant minority of people who have a dissenting opinion.

EndOfWorkConcerns

If machines end all or almost all employment, what are your biggest worries? Pick two.

Question Count Percent
People will just idle about in destructive ways 513 16.71%
People need work to be fulfilled and if we eliminate work we’ll all feel deep existential angst 543 17.687%
The rich are going to take all the resources for themselves and leave the rest of us to starve or live in poverty 1066 34.723%
The machines won’t need us, and we’ll starve to death or be otherwise liquidated 416 13.55%
This question is flawed because it demanded that respondents ‘pick two’ rather than up to two.

The plurality of worries are about elites who refuse to share their wealth.

Existential Risk

XRiskType

Which disaster do you think is most likely to wipe out greater than 90% of humanity before the year 2100? (The +/− percentages are each option’s change in share since the last survey.)

Nuclear war: +4.800% 326 (20.6%)

Asteroid strike: −0.200% 64 (4.1%)

Unfriendly AI: +1.000% 271 (17.2%)

Nanotech / grey goo: −2.000% 18 (1.1%)

Pandemic (natural): +0.100% 120 (7.6%)

Pandemic (bioengineered): +1.900% 355 (22.5%)

Environmental collapse (including global warming): +1.500% 252 (16.0%)

Economic / political collapse: −1.400% 136 (8.6%)

Other: 35 (2.217%)

Significantly more people worried about Nuclear War than last year. Effect of new respondents, or geopolitical situation? Who knows.

Charity And Effective Altruism

Charitable Giving

Income

What is your approximate annual income in US dollars (non-Americans: convert at www.xe.com)? Obviously you don’t need to answer this question if you don’t want to. Please don’t include commas or dollar signs.

Sum: 66054140.47

Mean: 64569.05

Median: 40000.0

Mode: 30000.0

Stdev: 107297.54

IncomeCharityPortion

How much money, in number of dollars, have you donated to charity over the past year? (non-Americans: convert to dollars at http://www.xe.com/ ). Please don’t include commas or dollar signs in your answer. For example, 4000

Sum: 2389900.65

Mean: 2914.51

Median: 353.0

Mode: 100.0

Stdev: 9471.96

XriskCharity

How much money have you donated to charities aiming to reduce existential risk (other than MIRI/​CFAR) in the past year?

Sum: 169300.89

Mean: 1991.78

Median: 200.0

Mode: 100.0

Stdev: 9219.94

CharityDonations

How much have you donated in US dollars to the following charities in the past year? (Non-Americans: convert to dollars at http://www.xe.com/) Please don’t include commas or dollar signs in your answer. Options starting with “any” aren’t the name of a charity but a category of charity.

Question Sum Mean Median Mode Stdev
Against Malaria Foundation 483935.03 1905.26 300.0 None 7216.02
Schistosomiasis Control Initiative 47908.0 840.49 200.0 1000.0 1618.79
Deworm the World Initiative 28820.0 565.1 150.0 500.0 1432.71
GiveDirectly 154410.18 1429.72 450.0 50.0 3472.08
Any kind of animal rights charity 83130.47 1093.82 154.24 500.0 2313.49
Any kind of bug rights charity 1083.0 270.75 157.5 None 353.4
Machine Intelligence Research Institute 141792.5 1417.93 100.0 100.0 5370.49
Any charity combating nuclear existential risk 491.0 81.83 75.0 100.0 68.06
Any charity combating global warming 13012.0 245.51 100.0 10.0 365.54
Center For Applied Rationality 127101.0 3177.53 150.0 100.0 12969.1
Strategies for Engineered Negligible Senescence Research Foundation 9429.0 554.65 100.0 20.0 1156.43
Wikipedia 12765.5 53.19 20.0 10.0 126.44
Internet Archive 2975.04 80.41 30.0 50.0 173.79
Any campaign for political office 38443.99 366.13 50.0 50.0 1374.31
Other 564890.46 1661.44 200.0 100.0 4670.81
“Bug Rights” charity was supposed to be a troll fakeout but apparently...

This table is interesting given the recent debates about how much money certain causes are ‘taking up’ in Effective Altruism.

Effective Altruism

Vegetarian

Do you follow any dietary restrictions related to animal products?

Yes, I am vegan: 54 (3.4%)

Yes, I am vegetarian: 158 (10.0%)

Yes, I restrict meat some other way (pescetarian, flexitarian, try to only eat ethically sourced meat): 375 (23.7%)

No: 996 (62.9%)

EAKnowledge

Do you know what Effective Altruism is?

Yes: 1562 (89.3%)

No but I’ve heard of it: 114 (6.5%)

No: 74 (4.2%)

EAIdentity

Do you self-identify as an Effective Altruist?

Yes: 665 (39.233%)

No: 1030 (60.767%)

The distribution given by the 2014 survey results does not sum to one, so it’s difficult to determine whether Effective Altruism’s membership actually went up, but if we take the numbers at face value it experienced an 11.13% increase in membership.

EACommunity

Do you participate in the Effective Altruism community?

Yes: 314 (18.427%)

No: 1390 (81.573%)

Same issue as the last question; taking the numbers at face value, community participation went up by 5.727%.

EADonations

Has Effective Altruism caused you to make donations you otherwise wouldn’t?

Yes: 666 (39.269%)

No: 1030 (60.731%)

Wowza!

Effective Altruist Anxiety

EAAnxiety

Have you ever had any kind of moral anxiety over Effective Altruism?

Yes: 501 (29.6%)

Yes but only because I worry about everything: 184 (10.9%)

No: 1008 (59.5%)


There’s an ongoing debate in Effective Altruism about what kind of rhetorical strategy is best for getting people on board and whether Effective Altruism is causing people significant moral anxiety.

It certainly appears to be. But is moral anxiety effective? Let’s look:

Sample Size: 244
Average amount of money donated by people anxious about EA who aren’t EAs: 257.54

Sample Size: 679
Average amount of money donated by people who aren’t anxious about EA who aren’t EAs: 479.75

Sample Size: 249
Average amount of money donated by EAs anxious about EA: 1841.53

Sample Size: 314
Average amount of money donated by EAs not anxious about EA: 1837.82

It seems fairly conclusive that anxiety is not a good way to get people to donate more than they already are, but is it a good way to get people to become Effective Altruists?

Sample Size: 1685
P(Effective Altruist): 0.394
P(EA Anxiety): 0.296
P(Effective Altruist | EA Anxiety): 0.5
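These are straight frequency counts over the survey rows; a sketch of the computation with hypothetical data (the `ea_probabilities` helper and the sample are made up for illustration):

```python
def ea_probabilities(rows):
    """rows: list of (is_ea, has_anxiety) booleans.
    Returns P(EA), P(anxiety), and P(EA | anxiety) as frequencies."""
    n = len(rows)
    p_ea = sum(ea for ea, _ in rows) / n
    p_anx = sum(anx for _, anx in rows) / n
    anxious = [ea for ea, anx in rows if anx]
    p_ea_given_anx = sum(anxious) / len(anxious)
    return p_ea, p_anx, p_ea_given_anx

# Hypothetical sample: 2 anxious EAs, 2 anxious non-EAs,
# 2 relaxed EAs, 4 relaxed non-EAs.
rows = ([(True, True)] * 2 + [(False, True)] * 2
        + [(True, False)] * 2 + [(False, False)] * 4)
print(ea_probabilities(rows))
```

When P(EA | anxiety) exceeds P(EA), as it does in the survey figures above, anxiety and identification are positively associated, though of course this says nothing about which causes which.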

Maybe. There is of course an argument to be made that sufficient good done by causing people anxiety outweighs feeding into people’s scrupulosity, but it can be discussed after I get through explaining it on the phone to wealthy PR-conscious donors and telling the local all-kill shelter where I want my shipment of dead kittens.

EAOpinion

What’s your overall opinion of Effective Altruism?

Positive: 809 (47.6%)

Mostly Positive: 535 (31.5%)

No strong opinion: 258 (15.2%)

Mostly Negative: 75 (4.4%)

Negative: 24 (1.4%)

EA appears to be doing a pretty good job of getting people to like them.

Interesting Tables

Charity Donations By Political Affiliation
Affiliation Income Charity Contributions % Income Donated To Charity Total Survey Charity % Sample Size
Anarchist 1677900.0 72386.0 4.314% 3.004% 50
Communist 298700.0 19190.0 6.425% 0.796% 13
Conservative 1963000.04 62945.04 3.207% 2.612% 38
Futarchist 1497494.11 166254.0 11.102% 6.899% 31
Left-Libertarian 9681635.61 416084.0 4.298% 17.266% 245
Libertarian 11698523.0 214101.0 1.83% 8.885% 190
Moderate 3225475.0 90518.0 2.806% 3.756% 67
Neoreactionary 1383976.0 30890.0 2.232% 1.282% 28
Objectivist 399000.0 1310.0 0.328% 0.054% 10
Other 3150618.0 85272.0 2.707% 3.539% 132
Pragmatist 5087007.61 266836.0 5.245% 11.073% 131
Progressive 8455500.44 368742.78 4.361% 15.302% 217
Social Democrat 8000266.54 218052.5 2.726% 9.049% 237
Socialist 2621693.66 78484.0 2.994% 3.257% 126
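The per-affiliation income and donation figures above are simple grouped sums; a sketch of the aggregation with hypothetical rows (the `charity_by_affiliation` helper is illustrative, not the survey’s actual code):

```python
from collections import defaultdict

def charity_by_affiliation(rows):
    """rows: (affiliation, income, donated) tuples. Returns per-affiliation
    totals, the percent of income donated, and the sample size."""
    totals = defaultdict(lambda: [0.0, 0.0, 0])
    for aff, income, donated in rows:
        t = totals[aff]
        t[0] += income
        t[1] += donated
        t[2] += 1
    return {aff: {"income": inc, "charity": don,
                  "pct_donated": round(100 * don / inc, 3),
                  "n": n}
            for aff, (inc, don, n) in totals.items()}

# Hypothetical survey rows: (affiliation, annual income, donations).
rows = [("Futarchist", 50000, 5000), ("Futarchist", 70000, 9000),
        ("Objectivist", 40000, 100)]
print(charity_by_affiliation(rows))
```

Note that this is the percent of *group* income donated, so one high earner who donates little can pull a whole affiliation’s figure down.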
Number Of Effective Altruists In The Diaspora Communities
Community Count % In Community Sample Size
LessWrong 136 38.418% 354
LessWrong Meetups 109 50.463% 216
LessWrong Facebook Group 83 48.256% 172
LessWrong Slack 22 39.286% 56
SlateStarCodex 343 40.98% 837
Rationalist Tumblr 175 49.716% 352
Rationalist Facebook 89 58.94% 151
Rationalist Twitter 24 40.0% 60
Effective Altruism Hub 86 86.869% 99
Good Judgement(TM) Open 23 74.194% 31
PredictionBook 31 51.667% 60
Hacker News 91 35.968% 253
#lesswrong on freenode 19 24.675% 77
#slatestarcodex on freenode 9 24.324% 37
#chapelperilous on freenode 2 18.182% 11
/r/rational 117 42.545% 275
/r/HPMOR 110 47.414% 232
/r/SlateStarCodex 93 37.959% 245
One or more private ‘rationalist’ groups 91 47.15% 193
Effective Altruist Donations By Political Affiliation
Affiliation EA Income EA Charity Sample Size
Anarchist 761000.0 57500.0 18
Futarchist 559850.0 114830.0 15
Left-Libertarian 5332856.0 361975.0 112
Libertarian 2725390.0 114732.0 53
Moderate 583247.0 56495.0 22
Other 1428978.0 69950.0 49
Pragmatist 1442211.0 43780.0 43
Progressive 4004097.0 304337.78 107
Social Democrat 3423487.45 149199.0 93
Socialist 678360.0 34751.0 41