It would be a message deliberately customized for each human, refined gradually over years of subtle, convincing argument. That’s how I understand the hypothetical.
I think an AI competent enough to manage this would have faster, easier ways to accomplish the same effect, but I do agree that this would quite likely work.
If an information channel is only ever used to transmit information that has negative expected value for the receiver, selection pressure incentivizes the receiver to ignore that channel.
That is to say, an AI which presents everyone with the most convincing-sounding argument for not reproducing will select for those people who ignore convincing-sounding arguments when choosing whether to engage in behaviors that lead to reproduction.
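To make the selection dynamic I mean concrete, here is a toy simulation sketch. Every number and name in it is invented purely for illustration, and resistance to persuasion is treated as perfectly heritable, which real populations are not:

```python
import random

def simulate(pop_size=100_000, p_resistant=0.10, persuasion_success=0.95,
             generations=5, offspring_per_parent=2.1, seed=0):
    """Toy model of the selection dynamic described above.

    Each generation, an AI sends a maximally convincing anti-reproduction
    argument to everyone. "Resistant" individuals ignore it; "susceptible"
    individuals are persuaded with probability persuasion_success and do
    not reproduce. Resistance is treated as perfectly heritable, which
    overstates reality -- all parameters here are made up for illustration.
    """
    random.seed(seed)
    population = [random.random() < p_resistant for _ in range(pop_size)]
    for gen in range(1, generations + 1):
        # Only resistant people, plus the few susceptible ones who happen
        # to shrug off the argument, reproduce this generation.
        parents = [r for r in population
                   if r or random.random() > persuasion_success]
        next_pop = []
        for r in parents:
            # Fractional expected offspring handled by a coin flip.
            kids = int(offspring_per_parent) + (random.random() < offspring_per_parent % 1)
            next_pop.extend([r] * kids)
        population = next_pop
        if not population:
            print(f"gen {gen}: population extinct")
            return
        frac_resistant = sum(population) / len(population)
        print(f"gen {gen}: size={len(population):>7,}  resistant={frac_resistant:.1%}")

simulate()
```

In this toy version, the resistant fraction comes to dominate within a couple of generations, but only after the first generation’s reproduction has already collapsed.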
Yeah, but… Selection effects, in an evolutionary sense, are relevant over multiple generations. The time scale of the effects we’re thinking about is less than the time scale of a single generation.
This is less a “magic attack that destroys everyone” and more one of a thousand cuts which collectively bring down society.
Some people get affected by arguments, others by distracting entertainment, others by nootropics that work well but stealthily have permanent impacts on fertility, and some get caught up in terrorist attacks by weird AI-led cults… Just a bunch of stuff from a bunch of angles.
Yeah, my argument was “this particular method of causing actual human extinction would not work”, not “causing human extinction is not possible”, with a side of “agents learn to ignore adversarial input channels, and this dynamic is frequently important”.
Yeah. I don’t actually think that a persuasive argument targeted at every single human is an efficient way for a superintelligent AI to accomplish its goals in the world. Someone else mentioned convincing the most gullible humans to hurt the wary humans. If the AI’s goal were to inhibit human reproduction, it would be simple to create a bioweapon that causes sterility without killing its victims. It doesn’t take very many loyally persuaded humans to be the hands for a mission like that.