Just another example of an otherwise-respectable (though not by me) economist spouting nonsense. I thought you guys might find it interesting, and it seemed too short for a top-level post.
Steven Landsburg has a new book out and a blog for it. In a post about arguments for/against God, he says this:
the most complex thing I’m aware of is the system of natural numbers (0,1,2,3, and all the rest of them) together with the laws of arithmetic. That system did not emerge, by gradual degrees, from simpler beginnings.
If you doubt the complexity of the natural numbers, take note that you can use just a small part of them to encode the entire human genome. That makes the natural numbers more complex than human life.
So how many whoppers is that? Let’s see: the maximally compressed encoding of the human genome is not sufficient data to describe the workings of human life. The natural numbers and the operations on them are extremely simple, because it takes very little to describe how they work; the complexity of the number system itself is not the same as the complexity of a specific model implemented with the natural numbers.
His description of the number system as emerging all at once, rather than by gradual degrees, is just confused: yes, people use natural numbers to describe nature, but that is not the same as saying their usefulness for modeling emerged all at once, which is the sense in which he was originally using the term.
What’s scary is that he supposedly teaches more math than economics.
Disclosure: Landsburg’s wife banned me from econlog.econlib.org a few years ago.
UPDATE2: Landsburg responds to my criticism on his blog, though without mentioning me :-(
I’m probably exposing my ignorance here, but didn’t zero have a historical evolution, so to speak? I’m going off vague memories of past reading and a quick glance at Wikipedia, but it seems like there were separate developments of the use of a placeholder, the concept of nothing, and the use of a symbol, which all eventually converged into the current zero. Seems like the evolution of a number to me. And it may be a just-so story, but I see it as eminently plausible that humans primarily work in base 10 because, for the most part, we have 10 digits, which again would be dictated by the evolutionary process.
On his human life point: if DNA encoding encompasses all of complex numbers (being that it needs that system in order to be described), isn’t it then necessarily more complex, since it requires all of complex numbers plus its own set of rules and knowledge base as well?
The ban was probably for the best, Silas; you were probably confusing everyone with the facts.
It sounds like a true story (note etymology of the word “digit”). But lots of human cultures used other bases (some of them still exist). Wikipedia lists examples of bases 4, 5, 8, 12, 15, 20, 24, 27, 32 and 60. Many of these have a long history and are (or were) fully integrated into their originating language and culture. So the claim that “humans work in base 10 because we have 10 digits” is rather too broad—it’s at least partly a historical accident that base 10 came to be used by European cultures which later conquered most of the world.
That’s a good point, Dan. I guess we’d have to check the number of base-10 systems vs. systems overall. Though I would continue to see that as again demonstrating an evolution of complex number theory, as multiple strands joined together when systems interacted with one another. There were probably plenty of historical accidents at work, like you mention, that helped bring about the current system of natural numbers.
Your recollection is correct: the understanding of math developed gradually. My criticism of Landsburg was mainly that he’s not even using a consistent definition of math.
And as you note, under reasonable definitions of math, it did develop gradually.
Yes, exactly. That’s why human life is more complex than the string representing the genome: you also have to know what that (compressed) genome specification refers to, the chemical interactions involved, etc.
:-)
Why does DNA encoding need complex numbers? I’m pretty sure simple integers are enough… Maybe you meant the “complexity of natural numbers” as quoted?
Sounds good to me (that’s what I get for typing quickly at work).
UPDATE: Landsburg replies to me several times on my blog. I had missed the window for comments, but Bob Murphy posted a reply to Landsburg on his (Murphy’s) blog, and I expanded my points in the linked post, which drew Landsburg.
Entertaining read. I’m a Landsburg fan, but he’s stepped in it on this one.
What is the notion of complexity in question? It could, for instance, be the length of the (hypothetical) shortest program needed to produce a given object, i.e. Kolmogorov complexity.
In that case, the natural numbers would have a complexity of infinity, which would be much greater than any finite quantity, such as a human life.
I may be missing something because the discussion to my eyes seems trivial.
The complexity doesn’t count the amount of data storage required, only the length of the executable code.
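For instance, a minimal sketch of such a program in Python (the source is a fixed handful of characters, even though the output never ends):
# constant-length source whose output is every natural number in turn;
# only the length of this source counts, not the unbounded output or the storage it needs
n = 0
while True:
    print(n)
    n += 1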
looks simple to me.
Yes, but how are you going to represent ‘n’ under the hood? You are eventually going to need infinitely many bits to represent it? I guess this is what you mean by storage. I should confess that I don’t know enough about algorithmic information theory, so I may be in deeper waters than I can swim. I think you are right though…
I had something more in mind like the number of bits required to represent an arbitrary natural number n, which is obviously log(n) (or maybe 2 log log(n) with some clever tricks, I think), and if n can get arbitrarily big, then the complexity, log(n), also gets arbitrarily big.
So maybe the problem of producing every natural number consecutively has a different complexity from producing some arbitrary natural number. Interesting…
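To put rough numbers on the log(n) point, a tiny illustrative sketch:
# the number of bits needed to write n in binary grows without bound as n grows
for n in (1000, 10**6, 10**100):
    print(len(bin(n)) - 2)   # 10, 20, 333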
Someone else should be the one to say this (do we have an information theorist in the house?), but my understanding is that Kolmogorov complexity does not account for memory usage (e.g. by using Turing machines with infinite tape). And thus producing a single, specific, sufficiently large natural number is more complex than producing the entire list, because “sufficiently large” here means “its digits alone take more space than the program which produces the entire list”.
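A rough way to see this, sketched in Python with an arbitrary cutoff of 1000 digits (a typical random digit string can hardly be compressed, so spelling it out is essentially its shortest program):
import random

# the enumerator's source has a fixed, small length...
enumerator_src = "n = 0\nwhile True:\n    print(n)\n    n += 1\n"

# ...while a program that prints one specific k-digit number must contain those digits
k = 1000
digits = "1" + "".join(random.choice("0123456789") for _ in range(k - 1))  # leading 1 keeps the generated source valid Python
single_number_src = "print(" + digits + ")\n"

print(len(enumerator_src))       # a few dozen characters
print(len(single_number_src))    # roughly k characters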
Yup, Kolmogorov complexity is only concerned with the length of the shortest algorithm. There are other measures (more rarely used, it seems) that take into account things like memory used, or time (number of steps), though I can’t remember their name just now.
Note that usually the complexity of X is the size of the program that outputs X exactly, not the program that outputs a lot of things including X. Otherwise you can write a quite short program that outputs, say, all possible ASCII texts, and claim that its size is an upper bound on the complexity of the Bible. Actually, the information needed to generate the Bible is the same as the information to locate the Bible in all those texts.
Example in Python:
chars = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ' + \
    '"!#$%&\'()*+,-./:;<=>?@[\\]^_`{|}~ \t\n\r'

def iter_strings_of_size(n):
    if n <= 0:
        yield ''
    else:
        for string in iter_strings_of_size(n-1):
            for char in chars:
                yield string + char

def iter_all_strings():
    n = 0
    while True:
        for string in iter_strings_of_size(n):
            yield string
        n = n + 1
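A small usage sketch of the generator above, just to make the “locating is as hard as generating” point concrete:
from itertools import islice

# illustrative only; assumes chars and iter_all_strings from the snippet above
print(list(islice(iter_all_strings(), 5)))   # ['', '0', '1', '2', '3']

def index_of(target):
    # the position of target in the enumeration carries exactly the
    # information needed to regenerate target
    for i, s in enumerate(iter_all_strings()):
        if s == target:
            return i

print(index_of('hi'))   # feasible only for very short targets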
This has improved my understanding of the Python “yield” statement.
Glad I could be of use!
Understanding the Power of Yield was a great step forward for me; afterwards I was horrified to reread some old code that was riddled with horrible classes like DoubleItemIterator and IteratorFilter that my C++-addled brain had cooked up, and realized half my classes were useless and the rest could have had their line count divided by ten.
And some people still count “lines of code” as a measure of productivity. Sob.
So that’s why my self-enhancing AI keeps getting bogged down!
Voted up for being really funny.
“Actually, the information needed to generate the Bible is the same as the information to locate the Bible in all those texts.”
Or to locate it in “The Library of Babel”.
I actually took information theory, but this is more of an issue of algorithmic information theory, something I have not studied all that much. Still, I think you are probably right, since Kolmogorov complexity refers to the descriptive complexity of an object, and here you can give a much shorter description of all the natural numbers in sequence.
This is very interesting to me, because intuitively both are problems involving infinity, and hence I lazily thought they would both have the same complexity.