I appreciate the post and Connor for sharing his views, but the antimeme thing kind of bothers me.
Here’s my hot take: I think Paul and Eliezer were having two totally different conversations. Paul was trying to have a scientific conversation. Eliezer was trying to convey an antimeme.
An antimeme is something that by its very nature resists being known. Most antimemes are just boring—things you forget about. If you tell someone an antimeme, it bounces off them. So they need to be communicated in a special way. Moral intuitions. Truths about yourself. A psychologist doesn’t just tell you “yo, you’re fucked up bro.” That doesn’t work.
A lot of Eliezer’s value as a thinker is that he notices & comprehends antimemes. And he figures out how to communicate them.
A lot of his frustration throughout the years has been him telling everyone that it’s really really hard to convey antimemes. Because it is.
If you read The Sequences, some of it is just factual explanations of things. But a lot of it is metaphor. It reads like a religious text. Not because it’s a text of worship, but because it’s about metaphors and stories that affect you more deeply than facts.
What happened in the MIRI dialogues is that Eliezer was telling Paul “hey, I’m trying to communicate an antimeme to you, but I’m failing because it’s really really hard.”
Does Connor ever say what antimeme Eliezer is trying to convey, or is it so antimemetic that no one can remember it long enough to write it down?
I understand that if this antimeme stuff is actually true, these ideas will be hard to convey. But it’s really frustrating to hear Connor keep talking about antimemes while not actually mentioning what these antimemes are and what makes them antimemetic. Also, saying “There are all these antimemes out there but I can’t convey them to you” is a frustratingly unfalsifiable statement.
I suppose if it’s an antimeme, I may just not be understanding it. But this was my understanding:
Most humans are really bad at being strict consequentialists. In this case, they think up some crazy scheme to slow down capabilities that seems sufficiently hardcore to signal that they are TAKING SHIT SERIOUSLY, while ignoring second-order effects that EY/Connor consider obvious. Anyone whose consequentialism has taken them to this place is not a competent consequentialist. EY proposes that such people (which I think he takes to mean everyone, possibly even including himself) follow a deontological rule instead: attempt to die with dignity. Connor analogizes this to reward shaping, the practice of assigning partial credit to RL agents for actions likely to be useful in reaching the true goal.
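For anyone unfamiliar with the RL term, here is a minimal sketch of what reward shaping means, assuming the standard potential-based formulation; the goal, constants, and function names are illustrative, not anything from Connor or EY. The agent earns partial credit for steps that move it toward the true goal, on top of the sparse true reward.

```python
# Minimal reward-shaping sketch (illustrative only).
# Potential-based shaping: r' = r + gamma * phi(s') - phi(s),
# which gives partial credit for progress without changing the optimal policy.

GOAL = 10      # goal position on a 1-D line (hypothetical toy task)
GAMMA = 0.99   # discount factor


def true_reward(next_state: int) -> float:
    """Sparse 'true goal' reward: 1 only when the goal is actually reached."""
    return 1.0 if next_state == GOAL else 0.0


def potential(state: int) -> float:
    """Potential function: higher when closer to the goal."""
    return -abs(GOAL - state)


def shaped_reward(state: int, next_state: int) -> float:
    """True reward plus partial credit for moving toward the goal."""
    return true_reward(next_state) + GAMMA * potential(next_state) - potential(state)


if __name__ == "__main__":
    print(shaped_reward(3, 4))  # step toward the goal: positive partial credit
    print(shaped_reward(3, 2))  # step away from the goal: negative credit
```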
I think that’s the antimeme from the Dying with Dignity post. If I remember correctly, the MIRI dialogues between Paul and Eliezer were about takeoff speeds, so Connor is probably referring to something else in the section I quoted, no?