Just to clarify—by “1% machine” do you mean a machine which serves (the most powerful) 1% of humanity?
There’s definitely a values question about how undesirable such an outcome would be compared to human extinction. I think there’s also substantial disagreement between Bill & Luke about the relative probabilities of those outcomes, though.
(As we’ve seen from the Hanson/Yudkowsky foom debate, drilling down to find the root cause of that kind of disagreement is really hard.)
Yes, that’s right.