Probably the biggest cryonics story of the year. In the print edition of The New York Times, it appeared on the front page, above the fold.
A Dying Young Woman’s Hope in Cryonics and a Future, by Amy Harmon
http://www.nytimes.com/2015/09/13/us/cancer-immortality-cryogenics.html
You can also watch a short documentary about Miss Suozzi here:
http://www.nytimes.com/video/science/100000003897597/kim-suozzis-last-wishes.html
Yet som there be that by due steps aspire
To lay their just hands on that Golden Key
That ope’s the Palace of Eternity.
(John Milton, Comus, lines 12-14)
May Kim find that Golden Key some day.
I wonder if the article will increase Alcor’s membership? As “Why have so few people signed up for cryonics?” is a big mystery for cryonics supporters such as myself, we should use the opportunity of the article to make predictions about its impact. I predict that the article will boost Alcor’s membership over the next year by 10% above trend, which basically means membership will be 10% higher a year from now than it is currently.
EDIT: I predict Alcor’s membership will be 11% higher a year from now than it is today. Sorry for the poorly written comment above.
Are those two 10% figures equal only by coincidence?
To me, “boost membership by 10% above trend” means either “increase this year’s signups by 10% of what they would otherwise have been” or else “increase this year’s signups enough to make membership a year from now 10% higher than it otherwise would have been”.
The second of these is equivalent to “membership will be 10% higher a year from now” iff membership would otherwise have been exactly unaltered over the year, which would mean that signups are a negligibly small fraction of current membership.
The first is equivalent to “membership will be 10% higher a year from now” iff m + 1.1s = 1.1m, where m and s are current membership and baseline signups for the next year; this is true iff m = 11s.
Those are both rather specific conditions, and the first seems pretty unlikely. Did you actually mean either of them, or have I misunderstood?
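A minimal numeric sketch of the two readings, using made-up membership and signup figures (m = 1000 and s = 90 are purely illustrative, not Alcor’s actual numbers):

```python
# Illustrative numbers only: suppose Alcor has m = 1000 members today and would
# otherwise gain s = 90 new members over the next year, with no departures.
m, s = 1000, 90

# Reading 1: the article boosts this year's signups by 10%.
after_reading_1 = m + 1.1 * s      # 1099.0

# Reading 2: the article makes membership a year from now 10% higher
# than it otherwise would have been.
after_reading_2 = 1.1 * (m + s)    # 1199.0

# The stated prediction: membership a year from now is 10% higher than today.
predicted = 1.1 * m                # 1100.0

# Reading 1 matches the prediction only when m = 11 * s (here m/s is about 11.1,
# so the two happen to be close); reading 2 matches only when s is negligible
# relative to m.
print(after_reading_1, after_reading_2, predicted)
```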
I am reading the grandparent literally as “increase membership”, which does imply that the current trend is flat and the membership numbers are not increasing.
Could be. But is Alcor really doing so badly? (Or: does James_Miller think they are?)
The graphs on this Alcor page seem to indicate that membership is in fact increasing by at least a few percent year on year, even if people are no longer counted as members after cryosuspension.
Hm. Yes, Alcor’s membership is going up nicely. I don’t know what James_Miller had in mind, then.
I made this into a prediction on PredictionBook.
Is the relevant data publicly accessible?
Yes, the data is online.
My understanding is that the number of people signed up is in the thousands, which, if correct, means probably a bit less than one in a million people.
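A quick check of that figure, with illustrative numbers (the membership count is just a guess in the same “thousands” range, not a sourced statistic):

```python
# Illustrative figures only: suppose ~4,000 people worldwide are signed up for
# cryonics, out of a world population of ~7.3 billion.
signed_up = 4_000
world_population = 7_300_000_000
print(signed_up / world_population)  # ~5.5e-07, i.e. a bit less than one in a million
```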
You might have meant it rhetorically, but if it is true that it is a “big mystery” to you why most people have not signed up, then your best guess for the reason should be that signing up for cryonics is foolish and useless. It is just as if a patient in a psychiatric ward finds himself thinking, “I wonder why so few people say they are Napoleon?” His best guess should be that the reason is that the people he knows, including himself, are not in fact Napoleon.
As another example, if you are at the airport and you see two lines while you are checking in, a very long one and a very short one, and you say, “It’s a big mystery to me why so many people are going in that long line instead of the short one,” then you’d better get in that long line, because if you get in the short one, you are going to find yourself kicked out of it. On the other hand if you do know the reasons, you may be able to get in the short line.
In the cryonics case, this is pretty much true no matter how convincing you find your reasons, until you can understand why people do not sign up.
But the intellectual quality of some of the people who have signed up for cryonics is exceptionally high (Hanson, Thiel, Kurzweil, Eliezer). Among the set of people who thought they were Napoleon (excluding the original), I doubt you would find many who had racked up impressive achievements.
What if you see Hanson, Thiel, Kurzweil, and Eliezer in the short line, ask them if you should get in the short line, and they say yes?
“What if you see Hanson, Thiel, Kurzweil, and Eliezer in the short line, ask them if you should get in the short line, and they say yes?”
As I pointed out last time you brought this up, these people aren’t just famous for being smart; they’re also famous for being contrarians and futurists. Cryonics is precisely an area in which you’d expect them to make a bad bet, because it’s seen as weird and it’s futuristic.
This depends on whether you model contrarianism and futurism as biases (‘Hanson is especially untrustworthy about futurist topics, since he works in the area’) or as skills one can train and bodies of knowledge one can learn (‘Hanson is especially trustworthy about futurist topics, since he works in the area’).
My typical heuristic for reliable experts (taken from Thinking, Fast and Slow, I think) is that if experts have tight, reliable feedback loops, they tend to be more trustworthy. Futurism obviously fails this test. Contrarianism isn’t really a “field” in itself, and I tend to think of it more as a bias… although EY would obviously disagree.
Then it might be that futurism is irrelevant, rather than being expertise-like or bias-like. (Unless we think ‘studying X while lacking tight, reliable feedback loops’ in this context is worse than ‘neither studying X nor having tight, reliable feedback loops.’)
Thiel, Yudkowsky, Hanson, etc. use “contrarian” to mean someone who disagrees with mainstream views. Most contrarians are wrong, though correct contrarians are more impressive than correct conformists (because it’s harder to be right about topics where the mainstream is wrong).
In this case futurism is two things in these people:
1. A belief in expertise about the future.
2. A tendency towards optimism about the future.
Combined, these mean that these people both think cryonics will work in the future, and are more confident in this assertion than warranted.
I don’t think so… it’s more someone who has the tendency (in the sense of an aesthetic preference) to disagree with mainstream views. In this case, they would tend to be drawn towards cryonics because it’s out of the mainstream, which should give us less confidence that they’re drawn towards cryonics because it’s correct.
One of the most common ways they use the word “contrarian” is to refer to beliefs that are rejected by the mainstream, for whatever reason; by extension, contrarian people are people who hold contrarian beliefs. (E.g., Galileo is a standard example of a “correct contrarian” whether his primary motivation was rebelling against the establishment or discovering truth.) “Aesthetic preference” contrarianism is a separate idea; I don’t think it matters which definition we use for “contrarianism”.
I think it matters in this context. If these people are contrarian simply because they happen to have lots of different views, then it’s irrelevant that they’re contrarian. If they’re contrarian because they’re DRAWN towards contrarian views, it means they’re biased towards cryonics.
I agree it matters in this case, but it doesn’t matter whether we use the word “contrarianism” vs. tabooing it.
Also, your summary assumes one of the points under dispute: whether it’s possible to be good at arriving at true non-mainstream beliefs (‘correct contrarianism’), or whether people who repeatedly outperform the mainstream are just lucky. ‘Incorrect contrarianism’ and ‘correct-by-coincidence contrarianism’ aren’t the only two possibilities.
Ok, so to summarize:
1. These people are futurists.
1a. If you believe futurists have more expertise on the future, then they are more likely to be correct about cryonics.
1b. If you believe expertise needs tight feedback loops, they are less likely to be correct about cryonics.
1c. If you believe futurists are drawn towards optimistic views about the future, they are less likely to be correct about cryonics.
2. These people are contrarians.
2a. If you believe they have a “correct contrarian cluster” of views, they are more likely to be correct about cryonics.
2b. If you believe that they arrived at contrarian views by chance, they are no more or less likely to be correct about cryonics.
2c. If you believe that they arrived at contrarian views because they are drawn to contrarian views, they are less likely to be correct about cryonics.
I believe 1b, 1c, and 2c. You believe 1a and 2a. Is that correct?
The intellectual quality of some people who have NOT signed up for cryonics is exceptionally high as well.
But the average is lower, and not signing up for cryonics is a “default” action: you don’t have to expend thought or effort in order to not be signed up for cryonics. A more relevant comparison might be to people who have written refutations or rejections of cryonics.
I don’t think the average matters; it’s the right tail of the distribution that’s important.
Take, say, people with 130+ IQ: that’s about 2.5% of your standard white population, and the overwhelming majority of them are not signed up. In fact, in any IQ quantile only a minuscule fraction has signed up.
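A rough numerical version of this point, assuming the usual Normal(100, 15) model for IQ and purely illustrative population and membership figures (the ~2,000 total is the rough number floated later in this thread, not a sourced statistic):

```python
import math

# Fraction of a population with IQ >= 130, assuming IQ ~ Normal(mean=100, sd=15).
mean, sd, cutoff = 100.0, 15.0, 130.0
frac_130_plus = 0.5 * math.erfc((cutoff - mean) / (sd * math.sqrt(2)))
print(frac_130_plus)  # ~0.023, roughly the "about 2.5%" cited above

# Illustrative figures: ~320 million US population, ~2,000 cryonics members total.
us_population = 320_000_000
cryonicists = 2_000
high_iq_pool = us_population * frac_130_plus
print(high_iq_pool)                # ~7.3 million people with IQ 130+
print(cryonicists / high_iq_pool)  # under 0.03%, even if every member were in this pool
```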
entirelyuseless made the point that low cryonics use rates in the general population are evidence against the effectiveness of cryonics. James Miller responded by citing evidence supporting cryonics: that cryonicists are disproportionately intelligent/capable/well-informed. If your response to James is just that very few people have signed up for cryonics, then that’s restating entirelyuseless’ point. “The intellectual quality of some people who have NOT signed up for cryonics is exceptionally high” would be true even in a world where every cryonicist were more intelligent than every non-cryonicist, just given how few cryonicists there are.
No, I don’t think he did. The claim that low uptake rate is evidence against the effectiveness of cryonics is nonsense on stilts. entirelyuseless’ point was that if you are in a tiny minority and you don’t understand why the great majority doesn’t join you, your understanding of the situation is… limited.
James Miller countered by implying that this problem can be solved if one assumes that it’s the elite (IQ giants, possessors of secret gnostic knowledge, etc.) which signs up for cryonics and the vast majority of the population is just too stupid to take a great deal when it sees it.
My counter-counter was that you can pick any measure by which to choose your elite (e.g. IQ) and still find that only a minuscule fraction of that elite chose cryonics—which means that the “just ignore the stupid and look at the smart ones” argument does not work.
Someone who mistakenly believes that he is Napoleon presumably thinks that he himself is impressive intellectually, and in the artificial example I was discussing, he would think that others who believe the same thing are also impressive. However, outside observers would not admit that in the Napoleon case, whereas in the cryonics case many people would, so in this respect the cryonics case is much more favorable than the Napoleon example. Still, as Lumifer pointed out, this is not a terribly strong positive argument, given that you will be able to find equally intelligent people who have not signed up for cryonics.
In the Hanson etc. airport situation, I would at least ask them why everyone else is in the long line, and if they had no idea then I would be pretty suspicious. In the cryonics case, in reality, I would expect that they would at least have some explanation, but whether it would be right or not is another matter. Ettinger at least thought that his proposal would become widely accepted rather quickly, and seems to have been pretty disappointed that it was not.
In any case, I wasn’t necessarily saying that signing up for cryonics is a bad thing, just that it seems like a situation where you should understand why other people don’t, before you do it yourself.
gjm posted a link to the data: Alcor says it has about 1,000 members at the moment.
Yes, I meant including other groups. It might be around 2,000 or so total but I didn’t want to assert that it is that low because I don’t know that for a fact.
But the logic that makes signing up for cryonics make sense is the same logic that humans are REALLY BAD AT doing. Following the crowd is generally a good heuristic, but you have to recognize its limitations.
In principle this is saying that you know why most people don’t sign up, so if you’re right about that, then my argument doesn’t apply to your case.
I’m impressed at how positively the author portrayed cryonicists. The parts which described the mishaps which occurred during/before the freezing process were especially moving.
The article discusses the Brain Preservation Foundation. The BPF has responded here:
A Courageous Story of Brain Preservation, “Dying Young” by Amy Harmon, The New York Times.
http://www.brainpreservation.org/a-courageous-story-of-brain-preservation-dying-young-by-amy-harmon-the-new-york-times/