Thanks for putting this together! Two suggestions:
Keep doing this. It seems valuable.
Can you talk Type III Audio into reading it? Already done.
Thanks for another thought-provoking post. This is quite timely for me, as I’ve been thinking a lot about the difference between the work of futurists and that of forecasters.
These are people who thought a lot about science and the future, and made lots of predictions about future technologies—but they’re famous for how entertaining their fiction was at the time, not how good their nonfiction predictions look in hindsight. I selected them by vaguely remembering that “the Big Three of science fiction” is a thing people say sometimes, googling it, and going with who came up—no hunting around for lots of sci-fi authors and picking the best or worst.
I think this is a clever way to try to avoid hindsight bias in selecting your futurists, but I think it’s at least plausible that only reasonably good futurists could rise to the status of “the Big Three of science fiction”. I’m assuming that the status is granted only several decades after the main corpus has been written and that reasonably good predictions (within the fiction) would help enormously in attaining it. On the other hand, imagine writers whose fiction became increasingly ridiculous as the future progressed because they did not make good predictions.[1] Surely it would be very difficult for such authors to become part of the science fiction elite.
I’m not at all certain of this argument and would like to understand more about how cultural works move from “popular at release” to “classic” status.
At any rate, I think we should be at least moderately concerned that there could still be significant selection bias in the group being analyzed.
For example, I would put C.S. Lewis’ space trilogy in this category. They were good books and a forceful argument against the worst sorts of consequentialism, but imo they were not great science fiction, primarily because the way he imagines space and life on other planets now seems completely ridiculous.
Mentorship is one of the most frequently requested services that AI Safety Quest sees when conducting Navigation calls. I hope this service can help bridge the gap between “I want to do something about AI safety” and “I’m working on a meaningful AIS project”. Many thanks to you both for making this happen.