I make no claim that Eliezer and/or the SIAI have anything like this in mind. It seems that they would like to build an absolutist AI. I find that very troubling.
If I thought they had settled on this, and that they were likely to succeed, I would probably feel it was very important to work to destroy them. I'm currently not sure about the first, and I think the second is highly unlikely, so it is not a pressing concern.