A bit has two possible values, 0 and 1; it doesn’t have a “destroyed” value.
A physical bit does. Remember that we are talking about an actual bit stored inside a memory location on the computer (say, a capacitor in a DRAM cell).
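A minimal sketch of the distinction, using nothing beyond standard Python (the names here are hypothetical, purely for illustration): the abstract bit has exactly two values, while any honest model of the physical storage cell needs at least one extra state for “this cell no longer holds a value at all”.

```python
from enum import Enum

class AbstractBit(Enum):
    # The mathematical object: exactly two values, nothing else.
    ZERO = 0
    ONE = 1

class PhysicalCellState(Enum):
    # Hypothetical model of the physical DRAM cell holding that bit:
    # besides storing a 0 or a 1, the capacitor itself can be wrecked,
    # at which point "what value does it hold?" has no answer.
    ZERO = 0
    ONE = 1
    DESTROYED = 2
```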
And, of course, it’s not enough to prefer an intact laptop over one wholly pulverised into plasma.
Why not? Not receiving any future reward is such a huge negative utility that it would take a very large positive utility to carry out an action that would risk that occurring. Would you allow a surgeon to remove some section of your brain for $1,000,000 even if you knew that that section would not affect your reward pathways?
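As a rough sketch of that expected-value argument, with made-up numbers (the discount factor, per-step reward, and 1% destruction risk are all assumptions, not anything from AIXI itself): even a small chance of losing the entire future reward stream swamps a modest one-off payoff.

```python
GAMMA = 0.99            # assumed discount factor
PER_STEP_REWARD = 1.0   # assumed baseline reward per step while intact
HORIZON = 10_000        # finite horizon standing in for "all future reward"

def discounted_stream(reward, steps, gamma=GAMMA):
    """Sum of reward * gamma**t for t = 0 .. steps-1."""
    return reward * (1 - gamma**steps) / (1 - gamma)

future_value = discounted_stream(PER_STEP_REWARD, HORIZON)  # ~100 with these numbers

def expected_value(one_off_bonus, p_destroy):
    """Take a one-off bonus now, but risk destroying the reward channel,
    after which all future reward is 0."""
    return one_off_bonus + (1 - p_destroy) * future_value

print(expected_value(0.0, 0.0))    # safe action: ~100.0
print(expected_value(0.5, 0.01))   # risky action: ~99.5, already worse
# The bonus only breaks even once it exceeds p_destroy * future_value (~1.0 here),
# and the larger the future reward stream, the higher that threshold climbs.
```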
If I had brain cancer or a cerebral AVM or the like, I’d pay to have it removed. See my edit. The root issue is that in AIXI’s model, the potential actions (that it iterates through) are not represented as the output of any hardware; they are forced onto the model. Consequently, the hardware that actually outputs those actions in the real world is not represented as critical, and it sits in parallel on the same power supply as the hardware that relays the actions, which is understood to be critically important. It literally thinks it has a brain parasite. Of course it won’t necessarily drop an anvil on the whole thing just because it is experimenting; that’s patently stupid. It will surgically excise some parts with great caution.
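To make the “forced onto the model” point concrete, this is roughly the AIXI expectimax expression in Hutter’s notation (finite horizon $m$, universal machine $U$, program length $\ell(q)$); the typography is my reconstruction rather than a quotation:

$$a_k := \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m} \big[r_k + \cdots + r_m\big] \sum_{q \,:\, U(q,\, a_1 \ldots a_m) \,=\, o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}$$

The actions $a_k, \ldots, a_m$ are free variables that the max operators range over and that are handed to the candidate environment programs $q$ as inputs; no program in the hypothesis class is asked to model the hardware that actually computes them, which is exactly why that hardware doesn’t register as critical.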