I think this is a valid point, but figuring out which side’s ontology is “more accurate” is a different topic that just isn’t what the original essay was about.
I guess my point is that no ontology is “fundamentally correct”, in the sense that your worldview of concepts exists merely as an abstraction layer over reality. But it could certainly be the case that some ontologies are “more accurate”, in that they map more cleanly onto reality, or satisfy other properties that make them nicer to work with. In that case, you might find it both instrumentally and epistemically useful to try to convince others to adopt such an ontology (and depart from the one they are currently using), hence the need for the techniques above.
(But that would, of course, require you to demonstrate the nicer properties of your ontology, which is a different topic.)