Yes, I totally agree with you: consistency and convenience are why we have chosen to use 1.9999… notation to represent the limit, rather than the sequence.
Consistency and convenience tend to drive most mathematical notational choices (with occasional other influences), for reasons that should be extremely obvious.
It just so happened that, on this occasion, I was not aware enough of either the actual convention, or of the other “things that this notation would be consistent with”, before I guessed at the meaning of this particular piece of notation.
And so the meaning I guessed was one of the two things that I thought were “likely meanings” for the notation.
In this case, my guess was for the wrong one of the two.
I seem to be getting a lot of comments implying that I should have somehow naturally realised which of the two meanings was “correct”… and I have tried very hard to explain why it is not obvious, and not somehow inevitable.
Both of my possible interpretations were potentially valid, and I’d like to insist that the sequence interpretation is wrong only by convention (i.e. maths has to pick one meaning or the other, and the one that happens to be most convenient for mathematicians is, in this case, the limit interpretation). But, as is clearly evidenced by the amount of confusion around the subject (ref the Wikipedia page), it is not intuitively obvious that one is “correct” and the other is “not correct”.
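To make the two candidate readings concrete, here is a minimal sketch of the distinction being discussed (my own gloss, not notation taken from the original exchange): the conventional reading takes the trailing dots as denoting a limit, while the alternative reading takes them as naming the sequence of partial decimals itself.

```latex
% Conventional (limit) reading: the dots denote the limit of the partial
% decimals, so the expression names a single real number.
1.9999\ldots \;:=\; \lim_{n \to \infty} \bigl( 2 - 10^{-n} \bigr) \;=\; 2

% Alternative (sequence) reading: the dots merely abbreviate the sequence of
% partial decimals, which never contains 2 but converges to it.
\bigl( 1.9,\; 1.99,\; 1.999,\; 1.9999,\; \dots \bigr) \;\longrightarrow\; 2
```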
I maintain that without knowledge of the convention, you cannot know which is the “correct” interpretation. Any assumption otherwise is simply hindsight bias.
it is not obvious intuitively that one is “correct” and one is “not correct”.
There is no inherent meaning to a set of symbols scrawled on paper. There is no “correct” or “incorrect” way of interpreting it; only convention (unless your goal is to communicate with others). There is no Platonic Ideal of Mathematical Notation, so obviously there is no objective way to pluck the “correct” interpretation of some symbols out of the interstellar void. You are right insofar as you say that.
However, you are expected to know the meaning of the notation you use in exactly the same way that you are expected to know the meaning of the words you use. Not knowing is understandable, but observing that it is possible not to know a convention is not a particularly deep philosophical insight.
People guess the meanings of words and notations from context all the time, especially when they aren’t specialists in the field in question. Plenty of interested amateurs read things without the benefit of years of training beforehand.
Some things just lend themselves to guessing the accepted meaning more easily than others. Where at all possible, it is a good idea to make the accepted meaning easy to guess rather than hard. Make it hard to fail.