Obvious problem 1: the video output or descriptions can contain basilisks, including ones that cause problem 2.
Obvious problem 2: Someone could ask for, and then verify, something that turns out to be a full UFAI without realizing it.
Obvious problem 3: A UFAI could arise inside the simulation used to produce the hypotheticals, and either hack its way out directly or cause problem 1 followed by problem 2.
And most obvious problem of all: Without being able to repeatedly take the highly dangerous and active step of modifying its own source code, it’ll never get smart enough to be useful on 95% of queries.
Fair point. In that case, given an unknown, partially complete AI, if the first action you take is “Let me just start reading the contents of these files without running them to see what this even does,” then someone could say “A UFAI put a basilisk in the source code and used it to kill all of humanity, you lose.”
That isn’t even entirely without precedent; take this as an example: http://boingboing.net/2012/07/10/dropped-infected-usb-in-the-co.html Sometimes malicious code really is literally left physically lying around, waiting for someone to pop it into a computer out of curiosity.