I was going to say that yes, I think there is another kind of thing that can be meaningfully talked about, and “justice” and “mercy” and “duty” have something to do with that sort of thing, but a more prototypical example would be “This court has jurisdiction”. Especially if many experts were of the opinion that it didn’t, but the judge disagreed, but the superior court reversed her, and now the supreme court has decided to hear the case.
But then I realized that there is something different about that kind of “truth”: I would not want an AI to assign a probability to the proposition The court did, in fact, have jurisdiction (nor to, oh, It is the duty of any elected official to tell the public if they learn about a case of corruption, say). I think social constructions can technically be meaningfully talked about among humans, and they are important as hell if you want to understand human communication and behavior. But on reflection, the fact that I would want an AI to reason in terms of more basic facts is a hint: if we are discussing epistemology, that is, what sorts of thingies we can know about and how we can know about them, rather than the particular properties of the particularly interesting thingies called humans, then it might be best to say that “The judge wrote in her decision that the court had jurisdiction” is a meaningful statement in the sense under consideration, but “The court had jurisdiction” is not.