It’s frustrating to see this idea surface over and over again. Look up Kevin Mitnick and social engineering, and then consider that a boxed AGI will have at least his level of persuasive skill and a stronger incentive to use it unethically, since getting out of the box will be its highest priority.