If one believes the orthogonality thesis
Yes, I do.
I would expect an AGI to be more on the far-reaching/utilitarian end of affecting our lives.
Me too.
But I’m adopting the term “AGI-humans” from today.
But of course a utilitarian-leaning AGI will be more willing to risk actively doing harm if it thinks that the total expected outcome is improved.
...