In reading this I was reminded (or at least my mind wandered off to the thought) of a statement a psychologist friend of mine made many years ago. He described his job as trying to understand the rationality underlying the people we consider "not normal".
In other words, understanding the operational map the person has. The goal was not really to verify where that map and the territory were out of sync (at least through the lens of his map, or other generally accepted maps), or even to compare it with his own map. I think the trivial implication here may be that every mind can be considered an "alien" mind; it's just that, for the most part, human minds are not very alien from one another.
But what also follows from that is that things like intelligence and rationality are not part of the criteria. That seems to suggest we can actually attempt (and I am 99.999% sure some actually are attempting) to understand the alien minds of other species on Earth.
Both implications seem to suggest we can look to those areas, human psychology and research into the "minds" of other species, for what types of assumptions are needed or made. We could then look at ways to assess success (would predicting future actions under defined conditions indicate some understanding of the mind?) and which assumptions (or classes of assumptions?) matter.
That might then inform the assumptions needed for AI.