No, but I think you’re misunderstanding Eliezer. Let me explain.
When I ask myself "Should I be dishonest at all in a particular situation?" I have pretty similar standards for lots of domains. The primary reason to ask is when there is a genuine question about whether an extremely powerful force is attempting to extract a specific lie from me, or whether an extremely powerful immoral force is leaving me no control over what it does except via deception. For domains where this is not the case, I want to speak plainly and honestly.
When I list domains and ask how honest one ought to be in them (things like being honest about your work history to the government, honest about your relationship history to prospective partners, honest about your criminal record to anyone, honest about how your work is going to your boss, honest in conversations about your honesty to anyone, and so on), the standard is to be truthful except in a small number of situations where incredibly powerful entities or forces have broken the game board badly enough that the moral thing to do is to lie.
I say this because I don't think that being honest about your honesty is fundamentally different from being honest about other things. For all of them there's a standard of no-lying, and an extremely high bar: a powerful entity must be threatening you and everything you care about before you have to lie.
Eliezer writes this reasoning about honesty:
And I think it’s reasonable to expect that over the course of a human lifetime you will literally never end up in a situation where a Gestapo officer who has read this essay is pointing a gun at you and asking overly-object-level-probing meta-honesty questions, and will shoot you if you try to glomarize but will believe you if you lie outright, given that we all know that everyone, innocent or guilty, is supposed to glomarize in situations like that. Up until today I don’t think I’ve ever seen any questions like this being asked in real life at all, even hanging out with a number of people who are heavily into recursion.
So if one is declaring the meta-honesty code at all, then one shouldn’t meta-lie, period; I think the rules have been set up to allow that to be absolute.
I don’t believe that Eliezer applies different standards of honesty to normal situations and to meta-sentences about honesty. I think he applies the same standards, and finds that you are more under threat on the object level than you are on the (explicitly-discussed) meta level.
Eliezer is very explicit and repeats many times in that essay, including in the very segment you quote, that his code of meta-honesty does in fact compel you to never lie in a meta-honesty discussion. The first 4 paragraphs of your comment are not elaborating on what Eliezer really meant, they are disagreeing with him. Reasonable disagreements too, in my opinion, but conflating them with Eliezer's proposal is corrosive to the norms that allow people to propose and test new norms.
Re-reading the post, I see I was mistaken. Eliezer is undeniably proposing an absolute rule on the meta-level, not one where dishonesty should be “held to an extremely high bar” as I discussed.
I’ll try to compress the difference between our proposals: I was proposing “Be highly honest, and be consistent when you talk about it on the meta-level”, whereas Eliezer is proposing “Be highly honest, and be absolutely honest when you talk about it on the meta-level”. The part I quoted was his consequentialist argument that the absolute rule would not be that costly, not a consequentialist account of when to be honest on the meta-level.