The problem with these particular extensions is that they don’t sound plausible for this type of AI. In my opinion, when talking with designers it would be easier to switch from this example to a slightly more sci-fi one.
The leap is between the obvious “it’s ‘manipulating’ its editors by recognizing simple patterns in their behavior” to “it’s manipulating its editors by correctly interpreting the causes underlying their behavior.”
Much easier to extend in the other direction first: “Now imagine that it’s not an article-writer, but a science officer aboard the commercial spacecraft Nostromo...”
Upvoted for remembering that Ash was the science officer and not just the movie’s token android.