First part. It seems we agree! I just consider A more likely because you are already in a world where you can use those AGIs to produce results, which is what a pivotal act would look like. EY et al. would argue this is not going to happen because the first machine will already have killed you. What I am criticizing is the position in the community where it is taken for granted that AGI = doom.
Second part, I also like that scenario! I don’t consider it especially unlikely that an AGI would try to survive like that. But watch out: you can’t really derive from this that the machine will have the capacity to kill humanity, only that a machine might try to survive this way. If you want to continue with the Bitcoin analogy, nothing prevents me from forking the code to create Litecoin and tuning the utility function to make it work for me.