You are partially making sense in that you are pointing to a modelling style, but it does leave me unsure whether I can correctly fill in the missing bits. My thinking is interfered with a lot by “getter functions” of the form [code]function get_color(self){ return self.color }[/code]. One of the points of such functions is that attributes tend to be private while methods are public, so the programmer, should they need to, can change the implementation details without messing with outside consumers. So the modelling style shares a similarity in that objects are allowed to secretly have details outside of their interface. Sure, if we have verbs and objects mixed up but can express object-like things as verbs by converting objects to verbs, then we only have to care about one basic ontology type. But I am unsure whether I missed it, or whether it is forthcoming, why it is important or valuable to focus on the verbs.
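To pin down the encapsulation point I mean, here is a minimal hypothetical sketch in TypeScript (the class name Paint and its private rgb field are my own inventions for illustration, not anything from the post): callers only ever see get_color(), so the private representation can be swapped out later without breaking them.

[code]
// Hypothetical sketch: the public getter hides the private representation.
class Paint {
  // stored internally as RGB components; outside code never touches this directly
  private rgb: [number, number, number];

  constructor(rgb: [number, number, number]) {
    this.rgb = rgb;
  }

  // the public interface stays fixed even if the private representation changes later
  get_color(): string {
    const [r, g, b] = this.rgb;
    return `rgb(${r}, ${g}, ${b})`;
  }
}

const p = new Paint([255, 0, 0]);
console.log(p.get_color()); // "rgb(255, 0, 0)"
[/code]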
I am unsure what rule-basedness is, but if it is different from the extensional conception of functions then I would be super intrigued. I can accept that sensing should be modelled with functions in that way, but it seems contradictory with how functions were supposed to be prediction or evolution models. So if I have a(b(c(d))), does it mean that d first goes through two kinds of evolutions and is then observed, or that d goes through one kind of evolution and then an observation of that is itself observed? I am expecting this kind of division is not an actual problem, but I can’t effortlessly go from the function formalism to observation/prediction formalisms and would likely make a lot of errors there.
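As a check on my own reading, here is a small hypothetical sketch in TypeScript (the names evolveC, evolveB, observeA, summariseA and the State type are mine, purely for illustration): the nesting in a(b(c(d))) only fixes the order of application, innermost first, while the types of the individual functions are what decide which of the two readings is in play.

[code]
// Hypothetical illustration of the two readings of a(b(c(d))).
type State = number;
type Reading = string;

// Reading 1: c and b are both evolutions, a is an observation of the final state.
const evolveC = (s: State): State => s + 1;
const evolveB = (s: State): State => s * 2;
const observeA = (s: State): Reading => `reading: ${s}`;

const d: State = 3;
const result1 = observeA(evolveB(evolveC(d))); // "reading: 8"

// Reading 2: c is an evolution, b is an observation, and a acts on the observation itself.
const observeB = (s: State): Reading => `reading: ${s}`;
const summariseA = (r: Reading): Reading => r.toUpperCase();

const result2 = summariseA(observeB(evolveC(d))); // "READING: 4"

console.log(result1, result2);
[/code]

In both cases d is consumed innermost-first; whether a given stage counts as evolution or observation is carried by that function’s type, not by the composition itself.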
My next series of posts will be directly about the Yoneda lemma, which basically tells us that everything you could want to know about an object is contained in the morphisms going into/out of the object. Moreover, we get this knowledge in a “natural” way that makes life really easy. It’s a pretty cool theorem.