This post, to me, is a textbook example of the conjunction fallacy (the so-called “Linda problem” from Thinking, Fast and Slow). It plays on people’s desire to believe narratives: by constructing a detailed narrative of how a single particular future would operate, you’ve made that particular future seem more likely than all the rest.
In reality, any future transportation scenario, whether it contains self-driving cars or not, whether it contains public transit or not, will be equally plausible and inevitable-seeming after it has come about. Your narrative does not show why it is any more likely or plausible than all of the other detailed narratives that could lead to other scenarios.
More specifically, I read things like
If our markets did not completely suffice this expropriation of commons resources, the people would have to run in torch-and-pitchfork and liberate the cars themselves. It is both evil and stupid to keep these noble service robots pent in a garage.
Or
The code will be public, of course, as the autonomous network is a public resource, and cars from competing manufacturers would like to be able to coordinate.
and “bullshit alarms” immediately trip in my mind. People hoard scarce resources today; why doesn’t the public run in and liberate them? Critical infrastructure is already computerized (and, even more horrifying, sometimes already hooked up to the public Internet) without its code being open source.
Until you can explain how we get from our present transportation equilibrium to your future utopian vision (beyond handwaving it away by saying, “Economic forces will take care of it”), your vision remains nothing more than a castle in the sky.
I’m not actually seeing why this post is purely an instance of the conjunction fallacy. Many of the details he describes are direct consequences of cars being autonomous, or indirect effects of that. That’s not to say there are no errors here, just that I don’t think it’s merely a list of statements A, B, C, etc. with no causal relationship between them.