I’m surprised you bring up Mikhail Gromov as a counterexample to Eliezer, considering that Gromov’s solution to existential risk, as presented in the quote above, can be paraphrased as: increase education so that someone has a good idea about how to fix everything.
(Actual quote: “People must have ideas and they must prepare now. In two generations people must be educated. Teachers must be educated now, and then the teachers will educate a new generation. Then there will be sufficiently many people to face the difficulties. I am sure this will give a result.”)
If he doesn’t have any other concrete ideas, then I would think he’d recognize Eliezer as a knowledgeable person with a potential solution fitting his criteria, and thus support him.
I don’t think that Gromov’s views and Eliezer’s views are necessarily incompatible.
My reading of Gromov’s quotation is that he does not have his eyes on a technological intelligence explosion, and that the existential risk he’s presently most concerned about is natural resource shortage.
This is in contrast with Eliezer, who does have his eyes on a technological singularity and does not presently seem concerned about natural resource shortage.
I would be very interested to see Gromov study the evidence for a near-term intelligence explosion, and to see how this affects his views.
I may eventually approach him personally about this matter (although I hesitate to do so, as I think it’s important that whoever approaches him on this point make a good first impression, and I’m not sure that I’m in a good position to do so at the moment).