When we take Prozac, we are following our wetware's commands to take Prozac. Similarly, when an AI reprograms itself, it does so according to its current programming. You could say that it goes beyond its original programming, in the sense that, after following it, it has new, better programming; but it's not as if it has some kind of free will that lets it ignore what it was programmed to do.
When a computer really breaks its programming, and quantum randomness results in what should be a 0 being read as a 1 or vice versa, the result isn’t intelligence. The most likely result is the computer crashing.
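The point about self-modification being just more execution can be made concrete. Here is a minimal sketch (my own toy example, with hypothetical names like `self_modify` and `make_agent`, not anything from the text): an "agent" that rewrites its own parameter, but only ever by running its current, deterministic update rule.

```python
# Toy illustration: an agent whose "self-improvement" step is itself
# just code being followed. All names here are invented for the sketch.

def make_agent(threshold):
    """Return a behavior fully determined by the current parameter."""
    def act(x):
        return x > threshold
    return act, threshold

def self_modify(threshold, feedback):
    # The new programming is computed BY the old programming,
    # deterministically. No step escapes the current rules.
    return threshold + feedback

act, t = make_agent(10)
t = self_modify(t, -3)   # the agent "reprograms" itself per its update rule
act, t = make_agent(t)
print(t)       # 7
print(act(8))  # True
```

The agent ends up with new, "better" programming, yet every change was produced by executing the programming it already had, which is the sense in which self-modification never requires stepping outside the program.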