I am using ‘counting’ to refer to any process by which a number is assigned as a symbol for a property.
I use + in both its concrete meaning (this field now contains two sheep) and its abstract meaning (this set now contains two sheep). I hope the context makes it clear when I am using each meaning, and why; the lack of clarity is, in fact, important. See my response to the first comment, in which I deliberately used the concrete meaning in reply to somebody using the abstract meaning. Both of us are correct, and the confusion itself is meaningful.
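To make the distinction concrete, here is a minimal sketch (a hypothetical illustration of my usage, not the actual exchange from the comments): the abstract + is a formal operation whose result is fixed by definition, while the concrete + is a physical process whose outcome is an empirical question.

```python
import random

def abstract_add(a: int, b: int) -> int:
    """Abstract '+': an operation on numbers. 2 + 2 is 4 by definition."""
    return a + b

def concrete_add(field: list, new_sheep: list) -> list:
    """Concrete '+': put the sheep in the field, then count what is actually
    standing there. The answer is an empirical observation, not a definition
    (a sheep may jump the fence, two may be counted as one, and so on)."""
    field = field + new_sheep
    if random.random() < 0.05:   # occasionally a sheep wanders off
        field = field[:-1]
    return field

print(abstract_add(2, 2))                                      # always 4
print(len(concrete_add(["ewe1", "ewe2"], ["ram1", "ram2"])))   # usually 4
```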
Because it -is- a linguistic problem—and because linguistic problems can, in fact, be real problems.
Viliam Bur encapsulated what I was trying to establish pretty well: “I think the idea was that perhaps for some alien intelligence the ‘+’ symbol could be useless or even meaningless, and something else would be in the place of ‘the most simple abstract computational operation’. Then the aliens could naively expect that every intelligence in the universe must know this very basic operation.”
But more than that—if your basic operations are different, it’s possible to come to very different conclusions.
One of my biggest revelations in mathematics came in statistics, when, after the class (myself included) had worked unsuccessfully for a couple of hours to integrate an equation, the instructor (who I’m sure was laughing at us) walked up to the board, converted it to a different coordinate system, and integrated the now-trivial equation in about thirty seconds.
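A classic example of the same trick (not necessarily the one from that class) is the Gaussian integral: it has no elementary antiderivative in Cartesian coordinates, but squaring it and switching to polar coordinates turns it into a thirty-second calculation:

$$
I^2 = \left(\int_{-\infty}^{\infty} e^{-x^2}\,dx\right)^{\!2}
    = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-(x^2+y^2)}\,dx\,dy
    = \int_{0}^{2\pi}\!\int_{0}^{\infty} e^{-r^2}\,r\,dr\,d\theta
    = 2\pi \cdot \tfrac{1}{2} = \pi,
$$

so $I = \sqrt{\pi}$. Same equation, different basic operations, wildly different difficulty.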
If your basic operations are different, you might be able to reach conclusions that were otherwise out of reach.
I have yet to find a concept that does not fit the description ‘may not be meaningful outside a limited domain.’
And in response to this I’ll ask: How many people accept this about the fundamental descriptors they use in their mathematics? How many people operate on the assumption that mathematics is a universal language, or that the universe runs on mathematics (which I generally interpret to mean -their- mathematics)?
You can’t add x+y and get a delicious pie as a result.
And yet most recipes follow a basically arithmetic formula: Add 5 units of meat, add 375 units of heat over 10 units of time.
The point, although I have taken a long way around to it, is that arithmetic may be a fundamentally -human- way of evaluating the universe. It goes without saying that it’s not the ideal model in many scenarios. And for those considering how to build AI, particularly those interested in solving intractable problems, it may be worth letting the AI come to its own model.