I guess a deficit in quantitative reasoning is just one of several contributing factors.
Another contributing factor I keep thinking about is the role of social media during the pandemic. Social media companies make money by engaging people: the longer people stay on a platform, the more data it can harvest and the more advertisements it can show, resulting in more revenue. The more data it has, the better it can target those ads, and so on. The easiest way to drive up engagement is to promote controversial posts (the more extreme the better: either you like them and share them, or you dislike them and talk about them). This leads to filter bubbles. Once the main orientation of a filter bubble is known, it is easy to drive up engagement further by showing each bubble posts aligned with its views (possibly even escalating to more extreme topics and standpoints).
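To make the incentive concrete, here is a toy sketch of that ranking logic. Everything in it is hypothetical (the post fields, the numbers, the scoring function are mine, not any real platform's), but it shows how optimizing purely for predicted engagement, where outrage counts the same as approval, naturally pushes the most polarizing content to the top of a feed.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float          # expected positive engagement
    predicted_angry_replies: float  # expected negative engagement

def engagement_score(post: Post) -> float:
    # A platform paid per minute of attention counts *all* reactions
    # as engagement -- outrage is worth as much as approval.
    return post.predicted_likes + post.predicted_angry_replies

def rank_feed(posts: list[Post]) -> list[Post]:
    # Pure engagement maximization: nothing penalizes divisiveness,
    # so a polarizing post wins as long as it provokes reactions
    # on both sides.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("cat picture", predicted_likes=50, predicted_angry_replies=1),
    Post("nuanced policy analysis", predicted_likes=10, predicted_angry_replies=2),
    Post("extreme hot take", predicted_likes=30, predicted_angry_replies=40),
])
print([p.text for p in feed])  # the hot take ranks first
```

Note that the "hot take" wins despite having fewer likes than the cat picture, simply because anger counts too. Any fix would have to add a term the business model has no reason to add.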
Of course this is not beneficial for society as a whole: it drives division and does nothing to foster a culture of open discussion. But it is currently a great and more or less unregulated money-making machine.
Pair that with a very capitalistic society without strong social safety nets and a situation that pushes people to the edge (e.g., a pandemic), and the mechanics outlined above run even faster (isolated people, increased fear of the unknown, mental health issues, etc.).
(And hey, you can even use this technology (unofficially) to harm other parties and cause a lot of damage at a fraction of the cost of traditional operations.)
And in the end, the mechanism outlined above comes down to misaligned incentives.
(Note: maybe I have recently been reading too much about misinformation and natural language processing.)