Well, this would be an example of one of the projects that I think may teach us something. But if you are speaking of “an ontology,” rather than just “ontology,” you may be talking about some theory of relativized ontologies, but more likely you’re not speaking about ontology in the same way as those who prioritize it over epistemology. Those who make epistemology primary still talk about things, they just disagree with the ontologists about complicated aspects of our relationship to the things and what our talk about the things means.
I’m not sure. Barry Smith, who leads Basic Formal Ontology, which gets used in medical informatics, writes in his paper “Against Fantology” sentences like:
It underlies the logical atomism of Bertrand Russell, including the central thesis according to which all form is logical form – a thesis which, be it noted, leaves no room for a discipline of formal ontology as something separate from formal logic.
Bayesianism as described by Yvain seems a bit like what Barry Smith describes as spreadsheet ontology, with probability values instead of logical true/false values.
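To make the comparison concrete, here is a toy sketch of my own (not drawn from Smith’s paper, and the predicates are made up): the “spreadsheet” picture treats the world as a grid of atomic predications, one cell per individual–property pair, and the Bayesian variant simply replaces the boolean in each cell with a degree of belief:

```python
# Toy illustration (my construction, not from "Against Fantology"):
# a "spreadsheet ontology" assigns each (individual, property) cell
# a truth value; the Bayesian variant assigns a probability instead.

# Classical spreadsheet: atomic predications are True/False.
boolean_ontology = {
    ("socrates", "is_mortal"): True,
    ("socrates", "is_a_philosopher"): True,
    ("athens", "is_mortal"): False,
}

# Bayesian variant: the same cells hold degrees of belief in [0, 1].
bayesian_ontology = {
    ("socrates", "is_mortal"): 0.999,
    ("socrates", "is_a_philosopher"): 0.95,
    ("athens", "is_mortal"): 0.001,
}

def credence(ontology, individual, prop):
    """Look up the (possibly probabilistic) value of an atomic predication."""
    return ontology[(individual, prop)]

print(credence(bayesian_ontology, "socrates", "is_mortal"))  # prints 0.999
```

The point of the sketch is only structural: both versions commit to the same flat cell grid of atomic predications, which is exactly the feature Smith objects to; swapping booleans for probabilities leaves that ontological shape untouched.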
Even if ontological questions can’t be settled in a way that decides which ontology is more correct than another, it seems to me that you have to decide on one ontology to use for your AGI. Different choices of how you structure that ontology will have a substantial effect on the way the AGI reasons.
Cyc calls itself an ontology. Doesn’t any AI need such an ontology to reason about the world?