A minimizer will fill the lightcone to make sure there aren’t paperclips elsewhere it can reach. What if other civs are hiding? What if there is undiscovered science which implies natural processes create paperclips somewhere? What if there are “Boltzmann paperclips”? Minimizing means minimizing!
I’m guessing even a Cthulhu minimizer (one that wants to reduce the number of Cthulhu in the world) will fill its lightcone with tools for studying its task, even though there is no reasonable chance that it’d ever need to do anything. It simply has nothing better to do; this is the problem it’s motivated to work on, so it’s what it’ll burn all available resources on.
My speculation is that the “what ifs” you describe might yield less positive utility than the negative utility due to the chance one of the AI’s descendants starts producing paperclips because “the sign bit flips spontaneously”. Of course the AI will safeguard itself against such events, but there are probably physical limits to safety.
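One way to formalize the trade-off being speculated about here (my notation, not anything stated in the thread): the minimizer should spawn a further descendant or probe only if

\[
\mathbb{E}[\text{paperclips the descendant prevents}] \;>\; p_{\text{flip}} \cdot \mathbb{E}[\text{paperclips it produces if its goal sign flips}],
\]

where \(p_{\text{flip}}\) is the hypothetical per-descendant probability of spontaneous value corruption. If \(p_{\text{flip}}\) has a physical floor, the right-hand side eventually dominates the ever-smaller residual “what ifs”, and further expansion stops paying for itself.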
the negative utility due to the chance one of the AI’s descendants starts producing paperclips because “the sign bit flips spontaneously”
It’s hard to make such estimates, as they require that an AGI is unable to come up with an AGI design that’s less likely than empty space to produce paperclips. I don’t see how the impossibility of this task could be guaranteed at a low level, as a “physical law”; and if you merely don’t see how to do it, an AGI might still find a way, as it’s better at designing things than you are. Empty space is only the status quo; it’s not obviously optimal at not producing paperclips, so it might be possible to find a better plan, which becomes more likely if you are very good at finding better plans.
If you mean “empty space” as in vacuum, then I think it doesn’t contain any paperclips more or less by definition. If you mean “empty space” as in thermodynamic equilibrium at finite temperature, then it contains some small number of paperclips. I agree it might be possible to create a state which contains fewer paperclips for some limited period of time (before the onset of thermodynamic equilibrium). However, that is probably much harder than the opposite (i.e. creating a state which contains many more paperclips than thermodynamic equilibrium does).
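To give a rough sense of the “small number” claim (a standard statistical-mechanics estimate I’m adding, not part of the original comment): the equilibrium probability of a thermal fluctuation assembling a configuration that sits a free energy \(\Delta F\) above equilibrium scales as

\[
p \;\sim\; e^{-\Delta F / k_B T},
\]

which for a macroscopic object like a paperclip is astronomically small but strictly nonzero, so equilibrium “contains” paperclips only in this vanishing-probability sense.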
It is not clear to me that the definition of the vacuum state (http://en.wikipedia.org/wiki/Vacuum_state) precludes the momentary creation of paperclips.
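For scale (a back-of-the-envelope estimate I’m adding, reading “momentary creation” loosely through the energy–time uncertainty relation): a vacuum fluctuation carrying the rest energy of a roughly 1 g paperclip could persist for only about

\[
\Delta t \;\sim\; \frac{\hbar}{m c^{2}} \;\approx\; \frac{10^{-34}\,\mathrm{J\,s}}{10^{-3}\,\mathrm{kg}\cdot\left(3\times 10^{8}\,\mathrm{m/s}\right)^{2}} \;\approx\; 10^{-48}\,\mathrm{s},
\]

so whatever the vacuum state “contains”, it isn’t paperclips that persist for any operationally relevant time.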