What is the notion of complexity in question? It could for instance be the length of the (hypothetical) shortest program needed to produce a given object, i.e. Kolmogorov complexity.
In that case, the natural numbers would have a complexity of infinity, which would be much greater than any finite quantity—e.g. a human life.
I may be missing something because the discussion to my eyes seems trivial.
The complexity doesn’t count the amount of data storage required, only the length of the executable code.
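For instance, something like the following (a minimal Python sketch added for illustration; the original comment did not include code) prints every natural number even though the program itself is only a few lines:

n = 0
while True:      # the output is infinite, but the source code stays this short
    print(n)
    n += 1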
Looks simple to me.
Yes, but how are you going to represent ‘n’ under the hood? You are eventually going to need infinitely many bits to represent it? I guess this is what you mean by storage. I should confess that I don’t know enough about algorithmic information theory, so I may be in deeper waters than I can swim. I think you are right though…
I had something more in mind like the number of bits required to represent any natural number n, which is obviously log(n) (or maybe log(n) + 2 log log(n) with some clever self-delimiting tricks, I think), and if n can get arbitrarily big, then the complexity, log(n), also gets arbitrarily big.
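To see that growth concretely (a quick illustrative sketch using Python’s built-in int.bit_length(), which counts the binary digits of n, i.e. roughly log2(n)):

# The number of bits needed to write n in binary grows like log2(n).
for n in (10, 10**6, 10**100):
    print(n.bit_length())  # prints 4, then 20, then 333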
So maybe the problem of producing every natural number consecutively has a different complexity from producing some arbitrary natural number. Interesting…
Someone else should be the one to say this (do we have an information theorist in the house?), but my understanding is that Kolmogorov complexity does not account for memory usage (e.g. by using Turing machines with infinite tape). And thus producing a single, specific, sufficiently large arbitrary natural number is more complex than producing the entire list—because “sufficiently large” in this case means “requiring a program longer than the one which produces the entire list”.
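One rough way to see the comparison (an illustrative caricature that measures Python source length rather than true Kolmogorov complexity; program_for and ALL_NATURALS are made-up names for this sketch):

# A program that prints one specific n must essentially spell n out,
# so its length grows with the number of digits of n...
def program_for(n):
    return f"print({n})"

# ...while a program that prints the entire list has a small fixed length.
ALL_NATURALS = "n=0\nwhile 1:print(n);n+=1"

print(len(program_for(10**100)))  # 108 characters
print(len(ALL_NATURALS))          # 25 characters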
Yup, Kolmogorov complexity is only concerned with the length of the shortest algorithm. There are other measures (more rarely used, it seems) that take into account things like memory used, or time (number of steps), though I can’t remember their names just now.
Note that usually the complexity of X is the size of the program that outputs X exactly, not the program that outputs a lot of things including X. Otherwise you could write a quite short program that outputs, say, all possible ASCII texts, and claim that its size is an upper bound on the complexity of the Bible. Actually, the information needed to generate the Bible is the same as the information needed to locate the Bible within all those texts.
Example in Python:

chars = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ' + \
        '"!#$%&\'()*+,-./:;<=>?@[\\]^_`{|}~ \t\n\r'

def iter_strings_of_size(n):
    # Yield every string of exactly n characters over `chars`.
    if n <= 0:
        yield ''
    else:
        for string in iter_strings_of_size(n - 1):
            for char in chars:
                yield string + char

def iter_all_strings():
    # Yield every finite string, in order of increasing length.
    n = 0
    while True:
        for string in iter_strings_of_size(n):
            yield string
        n = n + 1
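To make the “locating” point concrete (a hedged sketch building on the iterators above; text_at is a hypothetical helper, not part of the original example): a text of length L first appears at an index of roughly len(chars)**L, so the index needs about as many bits as the text itself.

from itertools import islice

def text_at(index):
    # The program is tiny; all the information lives in `index`.
    return next(islice(iter_all_strings(), index, None))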
This has improved my understanding of the Python “yield” statement.
Glad I could be of use!
Understanding the Power of Yield was a great step forward for me; afterwards I was horrified to reread some old code that was riddled with horrible classes like DoubleItemIterator and IteratorFilter that my C++-addled brain had cooked up, and realized half my classes were useless and the rest could have had their line count divided by ten.
And some people still count “lines of code” as a measure of productivity. Sob.
So that’s why my self-enhancing AI keeps getting bogged down!
Voted up for being really funny.
“Actually, the information needed to generate the Bible is the same as the information to locate the Bible in all those texts.”
Or to locate it in “The Library of Babel”.
I actually took information theory, but this is more of an issue of algorithmic information theory—something I have not studied all that much. Still, I think you are probably right, since Kolmogorov complexity refers to the descriptive complexity of an object, and here you can give a much shorter description of the list of all consecutive natural numbers.
This is very interesting to me: intuitively, both are problems involving infinity, and hence I lazily thought that they would both have the same complexity.