People tend to hear about the group from co-workers (usually at tech companies) or through a blog called LessWrong, associated with the artificial-intelligence researcher Eliezer Yudkowsky, who is also the author of the popular fan-fiction novel ‘‘Harry Potter and the Methods of Rationality.’’ (Yudkowsky founded the Machine Intelligence Research Institute (MIRI), which provided the original funding for CFAR; the two groups share an office space in Berkeley.) Yudkowsky is a controversial figure. Mostly self-taught — he left school after eighth grade — he has written openly about polyamory and blogged at length about the threat of a civilization-ending A.I. Despite this, CFAR’s sessions have become popular.
I think this is a fair description, and the eye-rolling is really in the “despite this,” which is comprehensible. Stronger is this:
Compulsive and rather grandiose, Yudkowsky is known for proclaiming the imminence of the A.I. apocalypse (‘‘I wouldn’t be surprised if tomorrow was the Final Dawn, the last sunrise before the earth and sun are reshaped into computing elements’’) and his own role as savior (‘‘I think my efforts could spell the difference between life and death for most of humanity’’).
The eye-rolling is stronger here, but the article backs it up with quotes.
Note that these quotes are from 2001 or earlier.
Also, a new pet peeve: quote databases that don’t provide sources. This one even tells you how to cite it, as if that gave it any authority.
Yup, EY brought enough rope; no reason to spin more.