Ehhh, I get the impression that Schmidhuber doesn’t think of human extinction as specifically “part of the plan”, but he also doesn’t appear to consider human survival to be particularly important relative to his priority of creating ASI. He wants “to build something smarter than myself, which will build something even smarter, et cetera, et cetera, and eventually colonize and transform the universe”, and thinks that “Generally speaking, our best protection will be their lack of interest in us, because most species’ biggest enemy is their own kind. They will pay about as much attention to us as we do to ants.”
I agree that he’s not overtly “pro-extinction” in the way Rich Sutton is, but he does seem fairly dismissive of humanity’s long-term future in general, while also pushing for the creation of an uncaring non-human thing to take over the universe, so...