(Originally sent as a PM, but I think it’s worth saying in public.)
Good work, first of all. I think you might still be a few inferential leaps ahead of many plausible readers, though. For instance, many people don’t actually know that it’s physically possible to run a WBE a million times faster than a biological brain, nor that there’s a lot we know to be programmable in principle but can’t yet implement.
You need to point out that nerve impulses are much slower than semiconductor logic gates, and that most of the reason the brain is better at many tasks is because 50 years of programming hasn’t yet caught up to several hundred million years of evolution on things like vision processing. Concepts like those might be helpful for the non-LW readers.
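That speed gap can be made concrete with a quick back-of-the-envelope calculation. (The figures below are rough, commonly cited illustrative numbers, not taken from the post under discussion.)

```python
# Rough comparison of signaling rates (illustrative figures, hedged:
# real neurons fire anywhere from ~1 to a few hundred Hz).
neuron_firing_hz = 200          # fast-ish cortical neuron, ~200 spikes/sec
logic_clock_hz = 3e9            # a commodity ~3 GHz processor clock

speedup = logic_clock_hz / neuron_firing_hz
print(f"Logic gates cycle roughly {speedup:,.0f}x faster than neurons fire")
```

Even with generous numbers for the neuron, the ratio comes out in the millions, which is the intuition the parenthetical is meant to supply.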
But I don’t mean to criticize too much, because what you’ve done is pretty excellent!
I think people understand that their calculator can do arithmetic much faster than they can. No?
Yes, but there’s a leap from there to the idea that a computer might be able to run the equivalent of neurons faster than a person. It might need to be stated explicitly for people who aren’t used to thinking about these issues.
This could simply be an indication that the brain’s architecture is not well-optimized for arithmetic. It doesn’t necessarily imply that calculators are faster.
The computer I had in 1999 had occasional difficulties in carrying out real-time emulation of a gaming console released in 1990. That doesn’t mean the console had better hardware.
I don’t think that many people consciously connect the two ideas; again, we’re talking about a short but essential inferential leap. (I’ve known some very smart people who were surprised when I pointed out this particular fact, by the way.)
Okay. I added a parenthetical: “computer circuits communicate much faster than neurons do”.