On anti-heroic epistemology (where it is non-virtuous to attempt to discriminate within an outside view), there shouldn’t be any impossible successes by anyone you know personally after you meet them.
I don’t understand why you say this. Given Carl’s IQ and social circle (didn’t he once work for a hedge fund run by Peter Thiel?), why would it be very surprising that someone he personally knows achieves your current level of success after he meets them?
They should only happen to other people selected post facto by the media, or to people who you met because of their previous success.
Carl referenced “Staring Into the Singularity” as an early indicator of your extraordinary verbal abilities (which explain much, if not all, of your subsequent successes), suggesting that’s how you initially attracted his attention. The same is certainly true for me: I distinctly recall saying to myself, “I should definitely keep track of this guy,” when I read it, back in the extropian days. Is that enough for us to count as “people who you met because of their previous success”?
In any case, almost everyone who meets you now would count you as such. What arguments can you give to them that “heroic epistemology” is normative (and hence they are justified in donating to MIRI)?
To state my overall position on the topic being discussed: I think that, according to “non-heroic epistemology”, after someone achieves an “impossible success”, you update towards them being able to achieve further successes of roughly the same difficulty in related fields that use similar skills. But the posterior probabilities of them solving much more difficult problems, or problems in fields that use very different skills, remain low (higher relative to the prior, but still low in an absolute sense). Given my understanding of the distribution of cognitive abilities in humans, I don’t see why I would ever “give up” this epistemology, unless you achieved a level of success that made me suspect you’re an alien avatar or something.
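To make the shape of that update concrete, here is a minimal numerical sketch; every hypothesis, prior, and likelihood below is invented purely for illustration, not calibrated to anyone real:

```python
# Illustrative Bayesian update; every number here is invented for the example.
priors = {"ordinary": 0.98, "exceptional": 0.019, "heroic": 0.001}

# Assumed P(one observed "impossible success" | ability level).
p_success = {"ordinary": 0.001, "exceptional": 0.2, "heroic": 0.9}

# Posterior over ability after observing the impossible success (Bayes' rule).
evidence = sum(priors[h] * p_success[h] for h in priors)
posterior = {h: priors[h] * p_success[h] / evidence for h in priors}

# Assumed P(solving a much more difficult problem | ability level).
p_harder = {"ordinary": 1e-6, "exceptional": 0.01, "heroic": 0.5}

prior_harder = sum(priors[h] * p_harder[h] for h in priors)
post_harder = sum(posterior[h] * p_harder[h] for h in posterior)

print(f"P(much harder success), prior:     {prior_harder:.5f}")
print(f"P(much harder success), posterior: {post_harder:.5f}")
# Roughly 0.00069 -> 0.08592: about two orders of magnitude above the prior,
# yet still low in an absolute sense, which is the update described above.
```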
Yes, no matter how many impossible things you do, the next person you meet thinks they only heard of you because of them; ergo, selection bias. This is an interesting question on a purely philosophical level: it seems to me to have some of the flavor of quantum-suicide experiments, where you can’t communicate your evidence. In principle, absent quantum suicide, this shouldn’t happen to logically omniscient entities who already know the exact fraction of people with various characteristics, i.e., who agree on exact priors; but I think it might start happening again to people who are logically unsure about which framework they should use.
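For what it’s worth, this post-facto selection effect is easy to exhibit in a toy model. In the sketch below (population size, attempt count, and per-attempt success probability are all invented for illustration), everyone is “ordinary” and success is pure luck, yet the population is still expected to contain someone with several “impossible” successes for observers to select after the fact:

```python
# Toy model of post-facto selection; all parameters are invented.
from math import comb

POPULATION = 1_000_000   # people an observer could in principle hear of
P = 0.001                # per-attempt chance of an "impossible" success
ATTEMPTS = 20            # attempts per person

def tail(k: int) -> float:
    """P(at least k successes) for one ordinary person: Binomial(ATTEMPTS, P)."""
    return sum(comb(ATTEMPTS, j) * P**j * (1 - P)**(ATTEMPTS - j)
               for j in range(k, ATTEMPTS + 1))

for k in range(1, 5):
    print(f"expected people with >= {k} successes: {POPULATION * tail(k):.2f}")
# With these numbers, roughly one person racks up three successes by luck
# alone; post-facto selection then surfaces exactly that person.
```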