“if the Pump could just be made to sense the proper (implied) parameters.”
You’re right, this would be an essential step. I’d say the main point of the post was to talk about the importance, and especially the difficulty, of achieving this.
Re optimisation for use: remember that this involves a certain amount of trial and error. In the case of dangerous technologies like explosives, firearms, or high speed vehicles, the process can often involve human beings dying, usually in the “error” part of trial and error.
If the technology in question were a super-intelligent AI, smart enough to fool us and engineer whatever outcome best matched its utility function, then we could potentially find ourselves unable to fix the “error”.
Please excuse the cheesy line, but sometimes you can’t put the genie back in the bottle.
Re the workings of the human brain: I have to admit that I don’t know the meaning of ceteris paribus, but I think that the brain mostly works by pattern recognition. In a “burning house” scenario, people would mostly contemplate the options that they thought were “normal” for the situation, or that they had previously imagined, heard about, or seen on TV.
Generating a lot of different options and then comparing them for expected utility isn’t the sort of thing that humans do naturally. It’s the sort of behaviour we have to be trained in if you want us to apply it.