I’m not sure I understand the case for this being so urgently important. A few ways I can think of that someone’s evaluation of AI risk might be affected by seeing this list:
1. They reason that science fiction does not reflect reality, so if something appears in science fiction it will not happen in real life; this list provides lots of counterexamples to that argument.
2. Their absurdity heuristic, operating at the gut level, assigns extra absurdity to anything they've seen in science fiction, so seeing this list will train their gut to treat sci-fi stuff as a real possibility.
3. This list makes them think the base rate for sci-fi tech becoming real is high, so they take Terminator as evidence that AGI doom is likely.
4. They think AGI worries are absurd for other reasons, and that the only reason anyone takes AGI seriously is that they saw it in science fiction. They also think the obvious silliness of Terminator makes belief in such scenarios less reasonable. This list reminds them that there's a lot of silly sci-fi that convinced some people about future tech/threats, which nonetheless became real, at least in some limited sense.
My guess is that some people are doing something similar to 1 and 2, but they're mostly not the people I talk to. I'm not all that optimistic about such lists working for 1 or 2, but it seems worth trying. Even if 3 works, I don't think we should encourage it, because it's terrible reasoning. I think 4 is fairly common, and sharing this list might help there, but the more important issue in that case is the "they think AGI worries are absurd for other reasons" part.
The reason I don't find this list very compelling is that I don't think you can learn much about reality from a bare list of technologies that were mentioned in sci-fi in some form before they became real. The details matter: comparing the directed-energy weapons in The War of the Worlds to actual lasers doesn't feel to me like an update toward "the future will be like Terminator".
(To be clear, I do think it's worthwhile to see how predictions about future technology have fared, and I think sci-fi should be part of that.)