To my understanding, sets are defined by their members. The part about there having to be some rule sounds very strange. Sure, you need to somehow express what the members are. In particular, if you have two rules but end up picking the same members, then you have defined only one set, not two. The rule is not part of the makeup of the set. It is confusing because a lot of real deduction seems to use the “intent” behind setting up the set to deduce what it does or does not contain, but that kind of deduction could be carried out without reference to memberships. If sets need to have rules, how do the composition functions obtain their rule?
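To illustrate what I mean by “defined by their members”, here is a small Python sketch (Python sets happen to behave exactly this way): two different rules that pick out the same members give you one set, not two.

```python
# Two different "rules" (predicates) that happen to pick out the same members.
evens_under_10 = {x for x in range(10) if x % 2 == 0}
listed_members = {x for x in range(10) if x in (0, 2, 4, 6, 8)}

# Sets are compared by membership alone: the rule that produced them
# is not part of the set.
assert evens_under_10 == listed_members  # one set, not two
```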
There is also a big difference between a set containing the word “dog” and a set containing a dog. And while at the start it seemed that “dogs lose all properties”, I am guessing that by the end we have things like legs(dog) = 4, which does move all the interesting stuff to the morphisms, but the labels are not arbitrary. In order to set up the correct morphisms I would need to know a whole lot about dogs.
It was also supposed to be that objects are nouns and morphisms are verbs. It is weird to think that legs(x) is a verb, or that legs turns dogs into numbers (in the sense that we predict this dog will spontaneously combust into the number 4).
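To make the legs(dog) = 4 reading concrete, this is the kind of picture I have in mind (a minimal Python sketch with made-up names): the set of dogs is just a bag of labels, and “legs” is a function from that bag to numbers, so all the dog knowledge has to go into setting up the function.

```python
# The "set of dogs" is just a bag of labels; the labels themselves carry
# no properties.
dogs = {"fido", "rex", "tripod"}

# All the dog knowledge lives in the morphism: "legs" sends each dog label
# to a number. Setting it up correctly requires knowing a lot about dogs.
def legs(dog: str) -> int:
    return {"fido": 4, "rex": 4, "tripod": 3}[dog]

assert legs("fido") == 4
```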
In the end, we don’t really care about sets at all. They’re just bags with stuff in them. Who cares about bags? But we do care about functions—we want those to be rule-based. We need functions to go “from” somewhere and “to” somewhere. Let’s call those things sets. Then we need these “sets” to be rule-based.
I’m grateful for your comments. They’re very useful, and you raise good points. I’ve already got most of a post written about how functions give meaning to the elements of sets. As for how a function is a verb, think of properties as existing in verbs. To know something, you need to observe it in some way, which means it has to affect your sensory devices, such as your ears, eyes, thermometers, whatever. You know dogs, for example, by the way they bark, by the way they lick, by the way they look, etc. So properties exist in the verbs. “Legs” is a noun, but all of your knowledge about legs has to come from verbs. Does that make sense?
You are making partial sense in that you are pointing to a modelling style, but it leaves me unsure whether I can correctly fill in the missing bits. My thinking is interfered with a lot by “getter functions” of the form `def get_color(self): return self.color`. One of the points of such getters is that attributes tend to be private while methods are public, so the programmer, should he need to do so, can change the implementation details without messing up outside customers. So the modelling style shares a similarity in that objects are allowed to secretly have details outside of their interface. Sure, if we have verbs and objects mixed up but can express object-like things as verbs, then by converting objects to verbs we only have to care about one basic ontology type. But I am unsure whether I missed it, or whether it is forthcoming, why it is important or valuable to focus on the verbs.
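To spell out the getter analogy I mean, here is a minimal Python sketch (names made up): outside customers only ever reach the colour through the public method, and whatever sits behind that method is an implementation detail that could change without anyone outside noticing.

```python
class Ball:
    """Toy example: outside customers only ever go through the public getter."""

    def __init__(self, color: str) -> None:
        self._color = color  # "private" by convention; an implementation detail

    def get_color(self) -> str:
        # The body of this method could change (say, computed from RGB values)
        # without messing up outside customers.
        return self._color


ball = Ball("red")
print(ball.get_color())  # "red" -- everything known about the ball arrives via the method
```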
I am unsure what rule-basedness is, but if it is different from the extensional conception of functions then I would be super intrigued. I can get that sensing should be modelled with functions in that way, but it seems to contradict how functions were supposed to be prediction or evolution models. So if I have a(b(c(d))), does it mean that d first goes through two kinds of evolutions and is then observed, or that d goes through one kind of evolution and then an observation of that is observed? I am expecting this kind of division is not an actual problem, but I can’t effortlessly go from the function formalism to the observation/prediction formalisms, and would likely make a lot of errors there.
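To make my question concrete with a toy sketch (the functions are made up): composition is associative, so grouping the pipeline as “evolve, evolve, observe” or as “evolve, then observe the observation” yields the same composite value, which is why I suspect the division is not a formal problem even though I keep tripping on the reading.

```python
# Made-up toy functions; whether b counts as "evolution" or "observation"
# is a labelling choice -- composition itself does not care.
def c(x): return x + 1                 # evolve
def b(x): return x * 2                 # evolve? or observe?
def a(x): return f"reading: {x}"       # observe

d = 3

evolve_twice_then_observe = a((lambda x: b(c(x)))(d))   # a after (b after c)
observe_the_observation = (lambda x: a(b(x)))(c(d))     # (a after b) after c

# Composition is associative, so both groupings give the same composite.
assert evolve_twice_then_observe == observe_the_observation == a(b(c(d)))
```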
My next series of posts will be directly about the Yoneda lemma, which basically tells us that everything you could want to know about an object is contained in the morphisms going into/out of the object. Moreover, we get this knowledge in a “natural” way that makes life really easy. It’s a pretty cool theorem.
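As a preview of the formal statement (one standard covariant formulation, for readers who want it ahead of those posts): for a locally small category $\mathcal{C}$, an object $A$, and a functor $F\colon \mathcal{C}\to\mathbf{Set}$, natural transformations out of the hom-functor correspond exactly to elements of $F(A)$, naturally in both $A$ and $F$:

$$\mathrm{Nat}\big(\mathrm{Hom}_{\mathcal{C}}(A,-),\,F\big)\;\cong\;F(A).$$

In particular, taking $F = \mathrm{Hom}_{\mathcal{C}}(B,-)$ shows that the hom-functor determines an object up to isomorphism, which is the precise sense in which the morphisms attached to an object tell you everything about it.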