That “0.99999....” represents a concept that evaluates to 1 is a question of notation, not mathematics. 0.99999… does not inherently equal 1; rather, by convention, it is understood to mean 1. The debate is not about the territory, it is about what the symbols on the map mean.
Where does one draw the line, if at all? “1+1 does not inherently equal 2; rather, by convention, it is understood to mean 2. The debate is not about the territory, it is about what the symbols on the map mean.” It seems to me that, very ‘mysteriously’, people who understand real analysis never complain “But 0.999… doesn’t equal 1”; sufficient mathematical literacy seems to kill any such impulse, which seems very telling to me.
Yes, and that’s a case of “you don’t understand mathematics, you get used to it.” Which applies exactly to notation and related conventions.
Edit:
More specifically, if we let a_k = 9/10^k, and let s_n be the sum from k=1 to n of a_k, then the limit of s_n as n goes to infinity will be 1, but 1 won’t be in {s_n | n in N}.
When somebody who is used to calculus sees “.99...”, what they are thinking of is the limit, which is 1.
But before you get used to that, most likely what you think of is some member of {s_n | n in N} with an n that’s large enough that you can’t be bothered to write all the nines, but which is still finite.
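Worked out, the partial sums have a simple closed form (a quick derivation, using only the finite geometric-series formula):

```latex
% Closed form of the partial sums s_n = \sum_{k=1}^{n} 9/10^k
s_n \;=\; \sum_{k=1}^{n} \frac{9}{10^{k}}
    \;=\; 9\cdot\frac{10^{-1}\,\bigl(1-10^{-n}\bigr)}{1-10^{-1}}
    \;=\; 1 - 10^{-n},
% so every s_n falls short of 1 by exactly 10^{-n}, while
\lim_{n\to\infty} s_n \;=\; \lim_{n\to\infty}\bigl(1-10^{-n}\bigr) \;=\; 1.
```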
Exactly. The arguments about whether 0.99999… = 1 are lacking a crucial item: a rigorous definition of what “0.9999...” refers to. The argument isn’t “Is the limit as n goes to infinity of the sum from k=1 to n of 9*10^-k equal to 1?” It’s “Here’s a sequence of symbols. Should we assign this sequence of symbols the value of 1, or not?” Which is just a silly argument to have. If someone says “I don’t believe that 0.9999… = 1”, the correct response (unless they have sufficient real analysis background) is not “Well, here’s a proof of that claim”, it’s “Well, there are various axioms and definitions that lead to that being treated as being equal to 1”.
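A minimal sketch of that distinction in exact rational arithmetic (the function name and the chosen values of n are illustrative):

```python
from fractions import Fraction

def partial_sum(n):
    """Exact value of s_n = 9/10 + 9/100 + ... + 9/10**n."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 5, 20):
    s = partial_sum(n)
    # Each finite partial sum misses 1 by exactly 10**-n ...
    print(n, s, 1 - s == Fraction(1, 10**n))

# ... whereas the notation "0.999..." is conventionally defined as the
# limit of these partial sums, which is exactly 1.
```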
It’s “Here’s a sequence of symbols. Should we assign this sequence of symbols the value of 1, or not?” Which is just a silly argument to have.
It’s not. The “0.999… doesn’t equal 1” meme is largely crackpottery, and promotes amateur overconfidence and (arguably) mathematical illiteracy.
Terms are precious real estate, and their interpretations really are valuable. Our thought processes and belief networks are sticky; if someone has a crap interpretation of a term, then it will at best cause unnecessary friction in using it (e.g. if you define the natural numbers to include −1, ..., −10 and have to retranslate theorems because of this), and at worst one will lose track of the translation between interpretations and end up propagating false statements (“2^n can sometimes be less than 2 for n natural”).
the correct response (unless they have sufficient real analysis background) is not “Well, here’s a proof of that claim”, it’s “Well, there are various axioms and definitions that lead to that being treated as being equal to 1”.
It would be an accurate response (even if not the most pragmatic or tactful) to say, “Sorry, when you pin down what’s meant precisely, it turns out to be a much more useful convention to define the proposition 0.999...=1 such that it is true, and you basically have to perform mental gymnastics to try to justify any usage where it’s not true. There are technically alternative schemas where this could fail or be incoherent or whatever, but unless you go several years into studying math (and even then maybe only if you become a logician or model theorist or something), those are not what you’ll be encountering.”
One could define ‘marble’ to mean ‘nucleotide’. But I think that somebody who looked down on a geneticist for complaining about people using ‘marble’ as if it means ‘nucleotide’, and who said it was a silly argument as if the geneticist and the person who invented the new definition were Just As Bad As Each Other, would be mistaken, and I would suspect they were more interested in signalling their Cleverness via relativist metacontrarianism than getting their hands dirty figuring out the empirical question of which definitions are useful in which contexts.
Actually, I could imagine you reading that comment and feeling it still misses your point that 0.999… is undefined or has different definitions or senses in amateur discussions. In that case, I would point to the idea that one can make propositions about a primitive concept that turn out to be false about the mature form of it. One could make claims about evidence, causality, free will, knowledge, numbers, gravity, light, etc. that would be true under one primitive sense and false under another. Then minutes or days or months or years or centuries or millennia later it turns out that the claims were false about the correct definition.
It would be a sin of rationality to assume that, since there was a controversy over definitions, and some definitions proved the claim and some disproved it, no side was more right than another. One should study examples of where people made correct claims about fuzzy concepts, to see what we might learn in our own lives about how these things resolve. Were there hints that the people who turned out to be incorrect ignored? Did they fail to notice their confusion? Telltale features of the problem that favoured a different interpretation? etc.
It’s not. The “0.999… doesn’t equal 1” meme is largely crackpottery
A lot of the “proofs” that it does equal 1 (in fact, all of them that don’t involve a rigorous treatment of infinite series) are fallacious, and so the refusal to accept them is actually a reasonable response.
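For concreteness, the most familiar of those informal arguments runs roughly like this (a sketch; the gap it papers over is noted in the comments):

```latex
% Let x denote whatever "0.999..." is taken to mean.
x   = 0.999\ldots \\
10x = 9.999\ldots \\
10x - x = 9 \;\Rightarrow\; 9x = 9 \;\Rightarrow\; x = 1
% The shift-and-subtract step tacitly assumes that "0.999..." already
% names a real number and that ordinary arithmetic applies to it digit
% by digit; justifying both is exactly what a rigorous treatment of the
% infinite series provides.
```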
You seem to be making an assertion about me in your last paragraph, but doing so very obliquely. Your analogy is not very good, as people do not try to argue that one can logically prove that “marble” does not mean “nucleotide”; they just say that it is defined otherwise.
If we’re analogizing ”.9999… = 1″ to “marble doesn’t mean’t nucleotide”, then ”
You seem to be making an assertion about me in your last paragraph, but doing so very obliquely.
Apologies for that. I don’t think that that specific failure mode is particularly likely in your case, but it seems plausible to me that other people thinking in that way has shifted the terms of discourse such that that form of linguistic relativism is seen as high-status by a lot of smart people. I am more mentioning it to highlight the potential failure mode; if part of why you hold your position is that it seems like the kind of position that smart people would hold, but I can account for those smart people holding it in terms of metacontrarianism, then that partially screens off that reason for endorsing the smart people’s argument.
It looks like you submitted your comment before you meant to, so I shall probably await its completion before commenting on the rest.
Mathematical arguments happen all the time over whether 0.99999... = 1, but I’m not sure if that’s interesting enough to count for what you want.
And yet I somehow doubt most of these people reject connectedness.