It seems like many disagreements ultimately stem from different estimates about the options available. Examples:
If you think “human society, but on Mars” is a realistic option, Elon Musk looks like a visionary. If you think it’s a fabricated option, he looks like a fool (but at least he seems to be having fun).
If you think “industrial society, but without world-destroying levels of fossil fuel use” is a live option, you might be right. But you could be wrong. It could be a fabricated option.
Leftists (stereotypically) view “current levels of wealth, but evenly distributed” as a real option. Conservatives (stereotypically) view self-interest as a major driver of wealth creation, and think that wealth creation without wealth inequality is a fabricated option.
I suspect that most people in this community will be prone to viewing all of the above as live options (rationalists, in my experience, have a strong bias toward optimism (1)). I personally lean in the other direction, but I’ve been wrong before.
(1) Yes, this is technically irony. It stems from a training data problem. The rationalist community’s training data vastly oversamples the tech industry in California between (circa) 1970 and 2010. That time, industry, and place saw the most dramatic technological revolution in history and is in no sense representative of human experience.
That time, industry, and place saw the most dramatic technological revolution in history
From what I understand, the Progress Studies people would strongly disagree with that and say that overall progress has significantly slowed down, and that while the Internet and computer revolution are significant, 100-ish years before that there were multiple such concurrent technological revolutions.
From Jason Crawford’s interview on Vox’s Future Perfect:
I was skeptical about this stagnation idea at first, especially when you look at the amazing progress of computers and the internet, until I started studying progress more broadly. And eventually, I came around.
What convinced me was simply looking at how many different parts of the economy were making progress as rapid as computers and the internet, about 100 years prior. If you take the 1970-through-2020 period where we had computers and the internet, and you compare that to 1870 to 1920, in that period you had an equivalent revolution in communications technology with the telephone and radio.
At the same time, you also had about an equal magnitude revolution in electricity with the electric generator, motor, and light bulb. You had an equivalent kind of revolution in the internal combustion engine and the automobile and the airplane. You had the first synthetic fertilizers with the Haber-Bosch process, you had the first plastics, with Bakelite. Plus, that was also the period in which the germ theory was developed and applied in the first chlorinated water systems, and vaccines for new diseases. You have, like, five revolutions all going on at the same time.
Computers and the internet are as big as any one of those revolutions. But as big as all five of them stacked up together? I don’t think you can really make that case.
Moore’s Law had processing power doubling every 18 months to two years for decades; the Atari 2600 of my youth had 128 bytes of RAM; the comparably-priced machine I’m typing this on has 8 billion. No other technology has ever improved by seven orders of magnitude in four decades AFAIK. The economic shifts that came with that made California (and more specifically the Bay Area) what it is today, and my point was that California is highly atypical.
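The back-of-envelope arithmetic here can be sanity-checked. A quick Python sketch (the 1977 launch year for the Atari 2600 and roughly 2020 for the modern machine are my assumptions, not stated in the comment):

```python
import math

# Assumed endpoints: Atari 2600 (~1977, 128 bytes of RAM)
# vs. a modern consumer machine (~2020, 8 GB of RAM).
atari_ram = 128      # bytes
modern_ram = 8e9     # bytes (8 GB)

ratio = modern_ram / atari_ram
orders = math.log10(ratio)      # orders of magnitude of improvement
doublings = math.log2(ratio)    # how many doublings that represents
years = 2020 - 1977
months_per_doubling = years * 12 / doublings

print(f"{orders:.1f} orders of magnitude over {years} years")
print(f"implies one doubling every {months_per_doubling:.0f} months")
```

This gives roughly 7.8 orders of magnitude and a doubling time of about 20 months, consistent with both the "seven orders of magnitude in four decades" claim and the canonical 18-to-24-month Moore's Law figure.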
On the other hand, I totally agree with the view that progress has overall slowed down. I think the difference is how you measure; measures that favor IT (e.g. information available) will show very different trends than other measures that may more reasonably reflect the impact of technology on human life (e.g. life expectancy, total energy use, inflation-adjusted mean family income). And even in the tech sector, most places weren’t as changed as California.
I don’t think we have serious disagreements; we’re just describing different parts of the same elephant.
Perhaps. OTOH, even the Atari 2600 was already a consumer-grade mass-market product; gene sequencing is only now getting there.
To be honest, there are a few other times and places where technological progress has been even faster, like Japan between 1865 and 1945 or Shenzhen between 1975 and 2020. Nevertheless, such meteoric rises are a vanishingly small part of human history. There are lots of places and industries where the last 40 years have seen only very modest improvements, quite a few where the trend has been modest decline, and some where the decline has been horrible (e.g. Lebanon, Yemen, Zimbabwe). In my extremely subjective, non-expert opinion, the rationalist community’s expectations for technological progress are reasonable for computer technology (until recently) but unreasonable given recent trends in industries like energy, transportation, agriculture, construction, medicine, and many more. In other words, it has a strong bias toward optimism.
Yes, but some estimates are clearly false, while your examples are estimates that may be true or may be false.
There are probably others. Genome sequencing is commonly cited as having improved substantially faster, gaining more orders of magnitude in fewer decades.