I basically don’t buy the conjecture of humans being super-cooperative in the long run, or of hatred decreasing and love increasing.
To the extent that something like this is true, I expect it to be a weird industrial-to-information-age relic that utterly shatters if AGI/ASI is developed, and this remains true even if the AGI is aligned to a human.
So just don’t make an AGI; do human intelligence amplification instead.