My problem with CEV is that who you would be if you were smarter and better-informed is extremely path-dependent. Intelligence isn’t a single number, so one can increase different parts of it in different orders. The order in which people learn things, how fully they integrate that knowledge, and what incidental declarative/affective associations they form with it can all send the extrapolated person off in different directions. Even assuming a CEV-executor took all that into account and summed over all possible orders (and assuming this could somehow be made computationally tractable), the extrapolation would get almost nowhere before fanning out uselessly.
OTOH, I suppose that there would be a few well-defined areas of agreement. At the very least, the AI could see current areas of agreement between people. And if implemented correctly, it at least wouldn’t do any harm.
Good point, though I’m not too worried about the path dependency myself; I’m more preoccupied with getting somewhere “nice and tolerable” than somewhere “perfect”.