CGP Grey has read Bostrom’s Superintelligence.

Transcript of the relevant section:

Q: What do you consider the biggest threat to humanity?

A: Last Q&A video I mentioned opinions and how to change them. The hardest changes are the ones where you’re invested in the idea, and I’ve been a techno-optimist 100% all of my life, but [Superintelligence: Paths, Dangers, Strategies] put a real asterisk on that in a way I didn’t want. And now Artificial Intelligence is on my near term threat list in a deeply unwelcome way. But it would be self-delusional to ignore a convincing argument because I don’t want it to be true.
I like how this response describes motivated cognition, the difficulty of changing your mind, and the Litany of Gendlin.

He also apparently discusses this topic on his podcast, and links to the Amazon page for the book in the video’s description.

Grey’s video about technological unemployment was pretty big when it came out, and it seemed to me at the time that he wasn’t far from realising that other implications of increasing AI capability were quite plausible as well, so it’s cool to see that this has happened.