1 number of length 0, 9 numbers of length 1 (and maybe 0), 90 numbers of length 2, 900 numbers of length 3, 9,000 numbers of length 4,
9*10^(n-1) numbers of length n. For each n, there are 10 times fewer numbers of the previous length and 10 times more numbers of the next length. If you track the running fraction of odd-length numbers among all numbers seen so far, it starts to go down once an even length is reached and starts to go up once an odd length is reached.
(Ignoring that most people don’t think of 0 as having length 0.)
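A quick brute-force check of those counts in Python, treating 0 as having length 0 as above (digit_length is just a throwaway helper):

```python
# Count how many non-negative integers have each digit length, up to a bound,
# and compare with the 9*10^(n-1) formula. 0 is treated as having length 0 here.
from collections import Counter

def digit_length(n):
    return 0 if n == 0 else len(str(n))

counts = Counter(digit_length(n) for n in range(10**5))
for length in sorted(counts):
    expected = 1 if length == 0 else 9 * 10 ** (length - 1)
    print(length, counts[length], expected)
# Prints counts 1, 9, 90, 900, 9000, 90000, matching the formula for each length.
```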
Jump by two orders of magnitude every time and the fraction stays stable:
Starting with nothing:
1 of even length, 9 of odd length.
90 of even length, 900 of odd length.
10% versus 90%.
Start after an even jump:
91 of even length, 9 of odd length.
9,091 of even length, 909 of odd length.
(Starts at 91% even, but drops after a double jump. I don’t know what the limit on this is.)
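A small sketch of those double jumps, built from the 9*10^(n-1) counts (counts_below is a made-up helper, and 0 is again treated as having even length 0); numerically the even-length share seems to settle just under 91%, around 10/11:

```python
# Cumulative even- vs odd-digit-length counts when the upper bound jumps
# by two orders of magnitude at a time.
def counts_below(power):
    """Even/odd digit-length counts among 0 .. 10**power - 1."""
    even, odd = 1, 0  # the lone number 0, counted as even length
    for length in range(1, power + 1):
        block = 9 * 10 ** (length - 1)
        if length % 2 == 0:
            even += block
        else:
            odd += block
    return even, odd

# Start after an "even" jump (bound 100), then keep jumping two orders at a time.
for power in range(2, 13, 2):
    even, odd = counts_below(power)
    print(f"below 10^{power}: {even / (even + odd):.6%} even length")
# 91.000000%, 90.910000%, 90.909100%, 90.909091%, ...
# creeping down toward 10/11 = 90.9090...%
```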
By comparison, resolving the proportion of even numbers versus odd numbers is much easier, because it’s a simple pattern that oscillates at a constant rate instead of changing.
9*10^(n-1) numbers of length n.
(in base 10)
Well, if a different color is used every time, then the coloring aspect is solved. If you ask about addition though, then things get weird.
Right, that was a math typo. It really oscillates between 9% and 91%.
For example, 909,090 of the numbers below a million have even length, i.e. about 91%. As you increase the bound toward ten million, this fraction decreases until it hits a minimum of 9%, and then starts increasing again until you reach a hundred million, and so on.
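The same kind of check at every power of ten, this time excluding 0 as in the 909,090 example (even_length_fraction is just a throwaway name):

```python
# Fraction of even-digit-length numbers among 1 .. 10**k - 1, for consecutive k.
# This is the oscillation described above: roughly 9% at odd powers of ten,
# roughly 91% at even powers of ten.
def even_length_fraction(power):
    even = sum(9 * 10 ** (n - 1) for n in range(2, power + 1, 2))
    return even / (10 ** power - 1)

for power in range(1, 9):
    print(power, f"{even_length_fraction(power):.2%}")
# 0.00%, 90.91%, 9.01%, 90.91%, 9.09%, 90.91%, 9.09%, 90.91%
```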
Well, the old solution to “what is the limit of +1, −1, +1, −1, etc.?” was (index starts at one, pattern is (−1)^(n+1)):
Consider the cases:
a) +1, odd index
b) −1, even index
Average them.
0.
If that were applied directly here, it’d be: (9% + 91%)/2 = 50%.
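One way to read “average them” is as a running mean of the terms, which is in the spirit of Cesàro averaging; a sketch of that, applied both to +1, −1, +1, −1, … and to the 9%/91% oscillation (running_mean is a throwaway helper):

```python
# Running mean of an alternating sequence: +1, -1, +1, -1, ... tends to 0,
# and a sequence alternating 9 and 91 tends to (9 + 91) / 2 = 50.
def running_mean(terms):
    total, means = 0.0, []
    for i, t in enumerate(terms, start=1):
        total += t
        means.append(total / i)
    return means

plus_minus = [(-1) ** (n + 1) for n in range(1, 1001)]          # +1, -1, +1, -1, ...
percents   = [9 if n % 2 == 1 else 91 for n in range(1, 1001)]  # 9, 91, 9, 91, ...

print(running_mean(plus_minus)[-1])  # 0.0
print(running_mean(percents)[-1])    # 50.0
```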
You could argue that it should be broken down differently, though, because the proportions here are different.
You could also declare the answer undefined, and say that infinity is about growth and doesn’t have a value for x % 2 (or for odd or even numbers, as the case may be), and that averages are ridiculous. (And once you have a breakdown of cases and a probability, what more is there?)
That is one of the puzzles: 0+0+0+0+0+… converges and has a value, but +1−1+1−1+1−1+…, which seems like it should be the same as (1−1)+(1−1)+(1−1)+(1−1)+…, diverges (and the series with and without the parentheses are not equivalent).
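A tiny illustration of why the parenthesised and unparenthesised versions aren’t equivalent: their partial sums behave differently (partial_sums is a throwaway helper):

```python
# Partial sums of 0+0+0+..., of 1-1+1-1+..., and of (1-1)+(1-1)+...
# The first and last sit at 0 forever; the middle one oscillates between 1 and 0.
def partial_sums(terms):
    total, out = 0, []
    for t in terms:
        total += t
        out.append(total)
    return out

zeros   = [0] * 8
grandi  = [(-1) ** n for n in range(8)]  # 1, -1, 1, -1, ...
grouped = [(1 - 1)] * 8                  # each (1-1) group collapsed to a single 0 term

print(partial_sums(zeros))    # [0, 0, 0, 0, 0, 0, 0, 0]
print(partial_sums(grandi))   # [1, 0, 1, 0, 1, 0, 1, 0]
print(partial_sums(grouped))  # [0, 0, 0, 0, 0, 0, 0, 0]
```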
The stream idea gives it a bit more wiggle room. Getting 1, 0, 1, 0, 1, … fish seems equivalent to getting 1/2 a fish a day, but 1, 1, 1, 1, 1, … seems like twice the fish of 1, 0, 1, 0, 1, 0, 1, 0, … So where the other methods say “can’t say anything”, there is maybe hope of capturing more cases with this kind of approach (rough sketch below).
Too bad it’s not super formal, and I can’t even pinpoint where the pain points for formalization would be.
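A minimal sketch of the stream comparison, assuming the “value” of a stream is just its long-run average catch per day (long_run_rate is a made-up name, and this is an informal reading rather than a formalization):

```python
# Long-run fish-per-day rate of two streams: 1, 0, 1, 0, ... versus 1, 1, 1, 1, ...
# The second works out to exactly twice the first, matching the intuition above.
def long_run_rate(daily_catches):
    return sum(daily_catches) / len(daily_catches)

days = 10_000
every_other_day = [1 if d % 2 == 0 else 0 for d in range(days)]  # 1, 0, 1, 0, ...
every_day       = [1] * days

print(long_run_rate(every_other_day))  # 0.5 fish per day
print(long_run_rate(every_day))        # 1.0 fish per day
```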