The specific scenario I talk about in the paragraph you’re responding to is one where everything except the sense of deepness is the same for both ideas, such that someone who lacks a sense of which ideas are deep or profound would find the two ideas basically equivalent.
But if that’s not what the distribution looks like, and instead there’s a strong correlation between seeming deep and actually being productive, then it’s not a bias, it’s just following what the distribution says. Maybe, to shore up / expand on your argument, you’re talking about the optimizer’s curse: https://www.lesswrong.com/posts/5gQLrJr2yhPzMCcni/the-optimizer-s-curse-and-how-to-beat-it
So like, the most deep-seeming idea will tend to regress to the mean more than a random idea would. But this doesn’t argue for not paying attention to things that seem deep. (It argues for a portfolio approach, but there are lots of arguments for a portfolio approach.)
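To make that concrete, here’s a minimal simulation sketch (the Gaussian setup, the 1000-idea pool, and the equal signal/noise variances are just illustrative assumptions, not anything from the post): the deepest-seeming idea overstates its true value much more than a randomly chosen idea does, but its true value is still well above average, which is why the curse argues for a haircut and a portfolio rather than for ignoring deep-seeming ideas.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ideas, n_trials = 1000, 2000

shortfall_best, shortfall_rand, true_best = [], [], []
for _ in range(n_trials):
    true_value = rng.normal(0, 1, n_ideas)  # how productive each idea actually is
    noise = rng.normal(0, 1, n_ideas)       # error in the "this seems deep" judgment
    seeming = true_value + noise            # noisy but genuinely correlated signal

    best = np.argmax(seeming)               # the deepest-seeming idea
    rand = rng.integers(n_ideas)            # a random idea, for comparison

    shortfall_best.append(seeming[best] - true_value[best])
    shortfall_rand.append(seeming[rand] - true_value[rand])
    true_best.append(true_value[best])

print(f"seeming minus true value, deepest-seeming idea: {np.mean(shortfall_best):+.2f}")  # large and positive
print(f"seeming minus true value, random idea:          {np.mean(shortfall_rand):+.2f}")  # roughly zero
print(f"true value of the deepest-seeming idea:         {np.mean(true_best):+.2f}")       # still well above average
```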
Maybe another intuition you’re drawing on is information cascades. If there are a lot of information cascades, then a lot of people end up paying attention to the same few very deep-seeming ideas. Which we can agree is dumb.
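A similarly rough sketch of the cascade worry (the sequential-choice setup and the 0.6 signal accuracy are made-up parameters for illustration, not a claim about the field): once a few early choices line up, later people rationally ignore their own signal, so everyone can lock onto one deep-seeming idea, and a non-trivial fraction of the time it’s the wrong one.

```python
import numpy as np

rng = np.random.default_rng(1)

def run_cascade(n_agents=100, p_correct=0.6):
    """Agents choose between idea 1 (actually better) and idea 0, one at a time.
    Each gets a private signal that is right with probability p_correct and sees
    every earlier choice. Once the visible lead for one idea outweighs a single
    private signal, everyone afterwards just follows the crowd."""
    choices = []
    lead = 0  # net count of informative choices for idea 1 vs idea 0
    for _ in range(n_agents):
        signal = 1 if rng.random() < p_correct else 0
        if lead > 1:      # crowd evidence beats one private signal: herd on idea 1
            choice = 1
        elif lead < -1:   # herd on idea 0
            choice = 0
        else:             # otherwise follow own signal, which the crowd can read off
            choice = signal
            lead += 1 if choice == 1 else -1
        choices.append(choice)
    return choices

wrong = sum(run_cascade()[-1] == 0 for _ in range(1000))
print(f"runs where the cascade locks onto the worse idea: {wrong}/1000")  # roughly 30% with these numbers
```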
I think on the margin new theoretical alignment researchers should do less of this, since I think most deep-sounding ideas just genuinely aren’t very productive to research, and they aren’t amenable to being proven unproductive to work on: often the only evidence that a deep idea isn’t productive is that nothing concrete has come of it yet.
I think this is pretty wrong, though it seems hard to resolve. I would guess that a lot of things that are later concretely productive started with someone hearing something that struck them as deep, and then chewing on it and transforming it.