But what makes you so confident that it’s not possible for subject-matter experts to have correct intuitions that outpace their ability to articulate legible explanations to others?
Yepp, this is a judgement call. I don’t have any hard and fast rules for how much you should expect experts’ intuitions to plausibly outpace their ability to explain things. A few things which inform my opinion here:
Explaining things to other experts should be much easier than explaining them to the public.
Explaining things to other experts should be much easier than actually persuading those experts.
It’s much more likely that someone has correct intuitions if they have a clear sense of what evidence would strengthen or weaken those intuitions.
I don’t think Eliezer is doing particularly well on any of these criteria. In particular, the last one was why I pressed Eliezer to make predictions rather than postdictions in my debate with him. His apparent confusion about why I cared about this was a noticeable update for me in the direction of believing that his intuitions are less solid than he thinks.
It may be the case that Eliezer has strong object-level intuitions about the details of how intelligence works which he’s not willing to share publicly, but which significantly increase his confidence in his public claims. If so, I think the onus is on him to highlight that so people can make a meta-level update on it.