To be honest, I'm just as afraid of aligned AGI as of unaligned AGI. An AGI aligned with the values of the PRC seems like a nightmare. If it's aligned with the US Army, it's merely really bad, and Yudkowsky's dath ilan is not exactly the world I want to live in either...
I disagree, because a world of misaligned AI is known to be really bad, whereas a world of AI successfully aligned by some opposing faction probably still preserves a lot of what you value.
Extreme case: ISIS successfully builds the first aligned AI and locks in its values. This is bad, but it's way better than misaligned AI. ISIS wants to turn the world into an idealized 7th-century Middle East, which is a pretty nice place compared to much of human history, so there's still a lot of overlap with your own values.