No, but it’s quite an interesting question. Evolution does go in for sticks as well as carrots, even though punishment has non-obvious costs among humans.
When I made my comment, I hadn’t read the interview. I’m not sure about Eliezer’s worst case scenario from lack of boredom—it requires that there be a best moment which the AI would keep repeating if it weren’t prevented by boredom. Is there potentially a best moment to tile the universe with? Could an AI be sure it had found the best moment?
The sticks are for things that are worse than sitting there doing nothing.
I figure boredom is like that—it has to work at the hedonic baseline—so it has to be a stick.
Is there potentially a best moment to tile the universe with? Could an AI be sure it had found the best moment?
Are its goals to find the “best moment” in the first place? It seems impossible to answer such questions without reference to some kind of moral system.
My mistake—here’s the original:

So, if you lost the human notion of boredom and curiosity, but you preserved all the rest of human values, then it would be like… Imagine the AI that has everything but boredom. It goes out to the stars, takes apart the stars for raw materials, and it builds whole civilizations full of minds experiencing the most exciting thing ever, over and over and over and over and over again.
The whole universe is just tiled with that, and that single moment is something that we would find very worthwhile and exciting to have happen once. But it lost the single aspect of value that we would name boredom and went instead to the more pure math of exploration-exploitation, where you spend some initial resources finding the best possible moment to live in and devote the rest of your resources to exploiting that one moment over and over again.
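(An illustrative aside, not anything from the interview itself: here is a minimal toy sketch, in Python, of the “pure exploration-exploitation” pattern described in that passage. Everything in it is made up for illustration: the candidate “moments”, their values, the budget, and the decay-based boredom penalty in the second agent. The point is only the contrast between spending the whole remaining budget repeating the single best moment found, and discounting a moment each time it repeats so that variety stays attractive.)

```python
# A toy, hypothetical sketch of the "pure exploration-exploitation" pattern
# described above: spend a fixed exploration budget searching for the single
# highest-value "moment", then spend every remaining resource repeating it.
# The second agent adds an invented boredom term that discounts a moment each
# time it is repeated, pushing it toward variety instead of tiling everything
# with one peak experience.

import random

random.seed(0)

# Hypothetical "moments", each with a fixed intrinsic value the agent can sample.
MOMENTS = {f"moment_{i}": random.uniform(0.0, 1.0) for i in range(50)}


def explore_then_exploit(budget: int, exploration_steps: int) -> float:
    """Spend some resources finding the best moment, then repeat it forever."""
    sampled = random.sample(list(MOMENTS), k=min(exploration_steps, len(MOMENTS)))
    best = max(sampled, key=MOMENTS.get)  # best moment found during exploration
    remaining = budget - len(sampled)
    # Exploitation phase: the same moment, over and over and over again.
    return sum(MOMENTS[m] for m in sampled) + remaining * MOMENTS[best]


def explore_with_boredom(budget: int, exploration_steps: int, decay: float) -> float:
    """Same strategy, but each repetition of a moment is worth less than the last."""
    sampled = random.sample(list(MOMENTS), k=min(exploration_steps, len(MOMENTS)))
    counts = {m: 0 for m in MOMENTS}
    total = 0.0
    for m in sampled:
        total += MOMENTS[m] * (decay ** counts[m])
        counts[m] += 1
    for _ in range(budget - len(sampled)):
        # Boredom: repeated moments are discounted, so the best next choice shifts.
        m = max(MOMENTS, key=lambda x: MOMENTS[x] * (decay ** counts[x]))
        total += MOMENTS[m] * (decay ** counts[m])
        counts[m] += 1
    return total


if __name__ == "__main__":
    print("pure exploit :", round(explore_then_exploit(1000, 20), 1))
    print("with boredom :", round(explore_with_boredom(1000, 20, decay=0.99), 1))
```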
So it’s “most exciting moment”, not “best moment”.
Even that might imply a moral system. Why most exciting rather than happiest or most content or most heroic?
Your point that “the sticks are for things that are worse than sitting there doing nothing” still might mean that boredom could be problematic for an FAI.
It’s at least possible that meeting the true standards of Friendliness (whatever they are) isn’t difficult for a sufficiently advanced FAI. The human race and its descendants have the mixture of safety and change that suits them, as well as such a thing can be done.
The FAI is BORED! We can hope that its Friendliness is a strong enough drive that it won’t make its life more interesting by messing with humans. Maybe it will withdraw some resources from satisfying people in order to do something more challenging. But how would that be feasible if everything within reach is devoted to Friendly goals?
The AI could define self-care as part of Friendliness, just as very generous people acknowledge that they need rest and refreshment.
I’m beginning to see the FAI as potentially a crazy cat-lady, but perhaps creating/maintaining only as many people as it can take care of properly is one of the easier problems.
So it’s “most exciting moment”, not “best moment”.
I suspect that’s an overstatement. Presumably it would be “most valuable moment,” where the value of a moment is determined by all the other axes of value except novelty.