How was it determined that “there are not enough resources in the universe to generate all possible books of 200 pages”? Would love to know the math behind this.
My reasoning was as follows.
A typical book page contains about 3,000 characters (spaces included).
If we encode each character in 8 bits (UTF-8 uses one byte per character for plain Latin text), a 200-page book contains 3,000 * 200 * 8 ≈ 5*10^6 bits, or roughly 600 KB.
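A quick back-of-the-envelope sketch in Python, just to make the arithmetic explicit (the 3,000-characters-per-page and one-byte-per-character figures are the assumptions above):

```python
# Rough size of one book under the stated assumptions:
# ~3,000 characters per page, 200 pages, 8 bits (1 byte) per character.
chars_per_page = 3_000
pages = 200
bits_per_char = 8

chars_per_book = chars_per_page * pages          # 600,000 characters
bits_per_book = chars_per_book * bits_per_char   # 4,800,000 bits, i.e. ~5*10^6
kb_per_book = chars_per_book / 1_000             # 600 KB at one byte per character

print(chars_per_book, bits_per_book, kb_per_book)
```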
Thus, to generate all possible 200-page books, we must enumerate every binary string of length 5*10^6.
There are 2^(5*10^6) such strings. Since log10(2) ≈ 0.3, that is about 10^(1.5*10^6), i.e. on the order of 10^10^6.
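Counting the strings is easy if we work with log10 instead of the number itself (a sketch, same assumptions as above):

```python
import math

bits_per_book = 4_800_000  # ~5*10^6 bits, from the size estimate above

# The number of distinct bit strings of that length is 2**bits_per_book.
# Take the base-10 logarithm instead of materializing the number.
log10_book_count = bits_per_book * math.log10(2)
print(f"number of possible books ~ 10^{log10_book_count:,.0f}")
# -> on the order of 10^(1.4*10^6), i.e. roughly 10^10^6
```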
There are at most about 10^82 atoms in the observable universe. If we turn every atom into a (classical) computer and split the work evenly among them, each computer still has to generate roughly 10^(10^6 − 82) books.
If each computer generates one book per femtosecond (10^-15 sec), the job still takes roughly 10^10^6 sec: knocking 82 + 15 off an exponent of about a million changes essentially nothing.
That is much, much longer than the time until the last black holes evaporate (~10^114 sec).
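The time estimate in the same log10 style (10^82 computers, one book per femtosecond, and ~10^114 sec until the last black holes evaporate are the assumptions from the text above):

```python
import math

log10_book_count = 4_800_000 * math.log10(2)  # ~1.4*10^6, from above
log10_computers = 82                          # one computer per atom
log10_books_per_sec = 15                      # one book per femtosecond

log10_books_each = log10_book_count - log10_computers
log10_seconds = log10_books_each - log10_books_per_sec
print(f"time per computer ~ 10^{log10_seconds:,.0f} seconds")

# Compare with ~10^114 seconds until the last black holes evaporate:
print(f"that is ~10^{log10_seconds - 114:,.0f} times longer")
```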
And that is only the generation. We would also need to write the books down somewhere, which adds time per book and a hell of a lot of storage.
I suspect the whole task could be done dramatically faster on quantum computers, but I’m not knowledgeable enough about the topic to predict the speedup. Could they do it in 1 sec? In 10^10^3 sec? No idea.
We could also massively speed the whole thing up by limiting it to realistic books rather than arbitrary strings of random characters, e.g. by using only the parts of UTF-8 that actually occur in books.
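To get a feel for how much the alphabet restriction alone buys, here is the same log10 comparison with a hypothetical ~100-character alphabet (letters, digits, punctuation) instead of all 256 byte values; counting only grammatical, meaningful books would cut far deeper, but is much harder to estimate:

```python
import math

chars_per_book = 600_000

log10_all_bytes = chars_per_book * math.log10(256)  # full 8-bit alphabet
log10_printable = chars_per_book * math.log10(100)  # hypothetical ~100 usable characters

print(f"full byte alphabet: ~10^{log10_all_bytes:,.0f} books")
print(f"~100-char alphabet: ~10^{log10_printable:,.0f} books")
# The restriction removes a factor of roughly 10^245,000 books, yet the
# remaining ~10^1,200,000 candidates are still hopelessly many.
```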
The estimate is based on my (ridiculously primitive) 20th-century understanding of computing. How people will think about such tasks in 1000 years is beyond my comprehension (just as measuring things in petabytes and petaFLOPS was beyond comprehension 70 years ago).