You can’t think about what specifically FAI will do, period.
“Specifically” is relative. By some standards, we have never thought specifically about anything at all. (I have never traced precisely the path of every atom involved in any action.)
Nonetheless, one can think more or less specifically, and to think at all is to think a thought that is specific to some extent. To think, as you wrote above, that an “FAI is a system expected to do something good” is to think something more specific than one might, if one were committed to thinking nothing specific, period. (This is assuming that your words have any meaning whatsoever.)
ETA: In other words, as Eliezer wrote in his Coming of Age sequence, you must be thinking something that is specific to some extent, for otherwise you couldn’t even pose the problem of FAI to yourself.
Sure. The specific thing you say is that the outcome is “good”, but what that means exactly is very hard to decipher, and in particular hard or impossible to decipher in the form of a story, with people, their experiences, and social constructions. It is the story that can’t be specific.
[ETA: I wrote the following when your comment read simply “Sure, why?”. I can see the plausibility of your claim that narrative moral imaginings can contribute nothing to the development of FAI, though it’s not self-evident to me.]
Perhaps I missed the point of your previous comment.
I presumed that you thought that I was being too specific. I read you as expressing this thought by saying that one should not think specifically, “period”. I was pointing out the impossibility or meaninglessness of that injunction, at least in its extreme form. I was implicitly encouraging you to indicate the non-extreme meaning that you had intended.