You seem to assume that you understand the motivations of the involved individuals based on a post that criticises them.
That’s a bad strategy.
If you don’t understand someone’s actions based on such analysis, most often it’s a bad idea to conclude that they are stupid. It’s more likely that you simply don’t have relevant understanding of their positions.
Not everyone believes in global warming. If you want to understand other people, you have to make it part of your model that they might not believe in it, even if you think the case for global warming is overwhelming.
An executive of General Motors might think that it’s bad if General Motors goes bankrupt and has to lay off thousands of workers in Detroit.
As a result, GM fights against US government regulations that it thinks will reduce its bottom line.
You seem to assume that you understand the motivations of the involved individuals based on a post that criticises them. That’s a bad strategy.
I agree that the bias of the author needs to be taken into account. But a) I’ve heard this stuff many times before, and b) I see this author as particularly trustworthy. If it were any old author making a claim that I had never heard before, I’d definitely agree with you, but that isn’t the case here. That said, I’m no expert/insider in this field, and so I’m not too confident.
If you don’t understand someone’s actions based on such analysis, most often it’s a bad idea to conclude that they are stupid. It’s more likely that you simply don’t have relevant understanding of their positions.
1) Maybe “stupid” is a bad word. Maybe I should have said “not instrumentally rational”.
2) “don’t understand” seems to be the key phrase here. It’s true that I don’t have perfect understanding, but given my limited understanding, my impression is still that it’s more likely than not that they’re not acting instrumentally rationally. I’m sure you agree with me that in theory, this could be true (the idea that you could be >50% sure that someone is stupid based on limited understanding). As for the specific point in this post, it’s definitely debatable.
Not everyone believes in global warming. If you want to understand other people, you have to make it part of your model that they might not believe in it, even if you think the case for global warming is overwhelming.
An executive of General Motors might think that it’s bad if General Motors goes bankrupt and has to lay off thousands of workers in Detroit. As a result, GM fights against US government regulations that it thinks will reduce its bottom line.
I hadn’t even thought about these two things, so thanks! Great points. I forgot how frequently humans bring themselves to believe things like this.
I agree that the bias of the author needs to be taken into account.
I have not said anything directly about the bias of the author in the sentence you quote.
But a) I’ve heard this stuff many times before, and b) I see this author as particularly trustworthy
Did the author interview a single executive of a car company to come to the conclusion that it’s greed that drives the decision against EVs?
I’m sure you agree with me that in theory, this could be true (the idea that you could be >50% sure that someone is stupid based on limited understanding).
If you see that the actions of another person don’t make sense based on the motivations you can see from the outside, you have two options:
1) Assume that there are motivations that you don’t see.
2) Assume that they are irrational.
Given the amount of knowledge you have in this case, you can’t assume that 2) is more likely.