That’s one sneaky parable—seems to point in a number of interesting directions and has enough emotional hooks (like feeling superior to the Pebble Sorters) to be distracting.
I’m taking it to mean that people can spend a lot of effort on approximating strongly felt patterns before those patterns are abstracted enough to be understood.
What would happen if a Pebble Sorter came to understand primes? I’m guessing that a lot of them would feel as though the bottom was falling out of their civilization and there was no point to life.
And yes, if you try to limit a mind that’s more intelligent than your own, you aren’t going to get good results. For that matter, your mind is probably more intelligent than your abstraction of your mind.[1]
It sounds as though an FAI needs some way to engage with the universe which isn’t completely mediated by humans.
We can hope we’re smarter than the Pebble Sorters, but if we’ve got blind spots of comparable magnitude, we are by definition not seeing them.
[1] On the other hand, if you have problems with depression, there are trains of thought which are better not to follow.
What would happen if a Pebble Sorter came to understand primes? I’m guessing that a lot of them would feel as though the bottom was falling out of their civilization and there was no point to life.
Really? I think they would find it an amazing revelation. They don’t need to fight about heap-correctness anymore; they can just calculate heap-correctness.
Remember, the meaning of the pebblesorting way of life is to construct correct heaps, not to figure out which heaps are correct.
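(To make “just calculate” concrete, here’s a minimal sketch of what such a heap-correctness check would look like, assuming, as the parable implies, that a heap is correct exactly when its pebble count is prime. The function name is my own invention, not anything from the parable.)

```python
def is_heap_correct(pebble_count: int) -> bool:
    """A heap is 'correct' iff its pebble count is prime (simple trial division)."""
    if pebble_count < 2:
        return False
    divisor = 2
    while divisor * divisor <= pebble_count:
        if pebble_count % divisor == 0:
            return False
        divisor += 1
    return True

# The parable's famous heaps: 13 is correct; 91 = 7 * 13 is not.
print(is_heap_correct(13))   # True
print(is_heap_correct(91))   # False
```

Once they have this, every debate over a disputed heap reduces to running the check.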
That’s their purported terminal value, yes. But if we just had magical boxes that we could put problems into and get solutions out of… we’d end up feeling quite listless. (Well, until somebody noticed that you could munchkin that to solve every problem ever and build utopia – perhaps even eutopia, if you could phrase your question right – because such boxes would be capable of violating the Second Law of Thermodynamics and more. But that doesn’t apply in the Pebblesorters’ case.)
Unlike them, our terminal values seem to include the feeling that we’re personally contributing. (A magic box that understood our terminal values, and told us how to solve our problems so as to maximize those values, would probably leave some parts of its answer open, so that we still felt we had agency in executing it.)