This may, perhaps, be confounded by the phenomenon where I am one of the last living descendants of the lineage that ever knew how to say anything concrete at all. Richard Feynman—or so I would now say in retrospect—is noticing concreteness dying out of the world, and being worried about that, at the point where he goes to a college and hears a professor talking about “essential objects” in class, and Feynman asks “Is a brick an essential object?”—meaning to work up to the notion of the inside of a brick, which can’t be observed because breaking a brick in half just gives you two new exterior surfaces—and everybody in the classroom has a different notion of what it would mean for a brick to be an essential object.
Richard Feynman knew to try plugging in bricks as a special case, but the people in the classroom didn’t, and I think the mental motion has died out of the world even further since Feynman wrote about it. The loss has spread to STEM as well. Though if you don’t read old books and papers and contrast them to new books and papers, you wouldn’t see it, and maybe most of the people who’ll eventually read this will have no idea what I’m talking about because they’ve never seen it any other way...
I find the claim that Eliezer is “one of the last living descendants of the lineage that ever knew how to say anything concrete at all” bizarre, since it seems to me that I observe at least some people around me regularly steering for concreteness in conversations, and in individual thought.
Admittedly it may be that they all learned this skill from Eliezer, his writings, derivative works, or by tracing back to Eliezer’s influences. But this particular skill doesn’t seem like a very tricky one to learn, compared to many much more subtle or ineffable skills.
I see people steer for it, but I know next-to-no people who do it reliably, especially when talking about “important” or “high status” things or “narratives about their life”. The plumber may talk concretely about his work but then talk magically about the news and politics and his marriage.
What’s an example of an “important” or “high status” thing that you’ve never observed someone talk concretely about?
idk, I feel like the Lightcone team talks about concrete things all the time?
I did think specifically of teammates when I said next-to-none.
I think you should potentially question your own epistemics if they lead you to the conclusion that you and your friends are some of the only competent-at-living-on-the-object-level people in the world, especially when what you’re describing is such an obviously-valuable skill that would be instrumentally useful for basically all real world impact. (If that’s not what you were saying, feel free to ignore this.)
People in your social circles are right about AI risk. Others are wrong. I understand the desire to try to find explanations for that. There are lots of explanations that don’t require beliefs like “we’re better at thinking than everyone else”. For example, you can believe that human civilization incentivizes lots of smart people searching thought-space in different directions, and you happen to have been adjacent to a fruitful vein that others have thus far failed to recognize. Believing instead that you have been successful “because nobody else really thinks on the object level anymore” is going to make it impossible to cooperate with or learn from the other smart serious people who do in fact exist. Whether or not you were planning to participate in that external cooperation, it’s a bad communal norm to be dismissive of potential allies.
I do question my own epistemics? Not sure about your argument regarding why I should, but I do.
Your second paragraph reads to me as “don’t have these beliefs because it would be socially costly”.
Yeah, you’re right, that is what my point boils down to. I think it’s a bad viewpoint for one’s tribe to publicly endorse, independent of whether one believes it’s true.
Maybe you can consider LW a non-public space, as far as “speaking candid thoughts” goes, and you’d have better data than me on that. But for example, I can promise you that if I send this post to the average persuadable ML person, they will basically check out when they read something like that. And that’s a real, concrete cost that shouldn’t just be waved away with “but I think it’s true, and thus to promote good communication norms I should let that belief be public.”
Oh. I do. Why don’t you?
EDIT: Nvm, I misunderstood the point, I thought the parent comment was arguing that people were good at being concrete, but apparently that was not the point, see followup thread with Ben
Hmm, it seems like the story (to which I am quite sympathetic) is “people are very competent at being concrete in domains where they have tons of feedback from reality, but stop being concrete as soon as you move to a domain in which that’s not the case”.
This story has people being good at the skill when it is actually important for their jobs, so it’s no longer subject to the critique “but this skill is so instrumentally useful that everyone would use it”.
I definitely think Eliezer’s claim is very hyperbolic in its implications[1], but I do think it is pointing at some real phenomenon where many people don’t particularly try to be concrete in domains they don’t have lived experience in.
Though who knows if it is literally false—what does it mean to be in a “lineage”? How many is implied by “one of the last”? I didn’t learn concreteness from Feynman, I can remember using it in random philosophical conversations in high school, long before I knew who Feynman was or what EA / rationality were. Does that mean I wouldn’t count as “one of the last of the lineage”, even if I have the skill?