Right. What’s disturbing is that people who don’t share these biases don’t respond with estimates of their own; they respond with “too negligible to matter”.
So, what would be a rational way to update on both the detailed numbers from sources biased toward believing that overpopulation is a threat and the vague numbers from sources biased against believing it is?
What do you think the nature of each of these biases might be? Perhaps that might shed some light on how to correct for them.
By the way, how is this any different from half a century of predictions that AI is just around the corner?