I may be wrong, but don’t all distributed systems suffer from diminishing returns in this way? For example, doubling the number of CPUs in a computing cluster does not let you solve your calculations twice as quickly. Overhead such as control infrastructure and plain old network latency grows faster than linearly with every CPU you add, and eventually outgrows the useful processing power you can get out of new CPUs.
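To make that concrete with one standard model (not something from the original discussion): Gunther’s Universal Scalability Law captures exactly this shape, with a contention term that grows linearly in the number of CPUs and a coherency (crosstalk) term that grows roughly quadratically. The sketch below uses made-up illustrative coefficients, not measurements from any real cluster.

```python
# A minimal sketch of the diminishing-returns argument, using Gunther's
# Universal Scalability Law as a stand-in model. The alpha and beta
# coefficients are illustrative assumptions, not real measurements.

def usl_speedup(n, alpha=0.05, beta=0.001):
    """Speedup of n CPUs relative to 1, under the USL.

    alpha models serialized work (queueing on shared resources);
    beta models pairwise coordination cost, which grows ~n^2 and is
    the 'overhead outgrows useful work' term in the comment above.
    """
    return n / (1 + alpha * (n - 1) + beta * n * (n - 1))

for n in (1, 2, 4, 8, 16, 32, 64, 128, 256):
    print(f"{n:4d} CPUs -> speedup {usl_speedup(n):6.2f}")

# With these coefficients the speedup peaks near n = 31
# (n* ~ sqrt((1 - alpha) / beta)) and then declines: adding CPUs
# past that point makes the system slower, not just sub-linearly
# faster.
```

Whether a given system ever reaches that turnover point depends entirely on how small its coordination coefficients are, which is where the reply below comes in.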
Asynchronous computers could easily grow to planetary scale. Parallel computing rarely achieves linear scalability, but it doesn’t necessarily flatten off quickly at small sizes, either.