That’s why I said ‘self-deluded’ rather than just ‘deluded’. There is a big difference between believing something incorrect because it is believed by default, and inventing for yourself a very convenient incorrect belief that makes you feel good and pays the bills, and then actively working to avoid any challenges to that belief. Honest people are those who put such beliefs to real scrutiny (not those who merely talk about putting such beliefs to scrutiny).
Honesty is an elusive matter when the belief works like the dragon in the garage. When you are lying, you have to deceive computational processes that are roughly your equals. That rules out all straightforward approaches to lying, such as waking up in the morning and thinking ‘how can I be really bad and evil today?’. Lying is a complicated process, with many shortcuts taken around the truth. I define lying as the successful generation of convincing untruths—a black-box definition that doesn’t get into which parts of the cortex process the truths and which process the falsehoods. (I exclude the sporadic, accidental generation of such untruths by mistake, unless the mistakes are being chosen.)
Hrm? If Newton and Kepler were deluded by mysticism, they were self-deluded. They weren’t toeing a party line and they weren’t echoing conventional views. They sat down and thought hard and came up with beliefs that seem pretty nuts to us.
I see that you want to label it as “not honest” if they don’t put those beliefs to good scrutiny. I think you are using “honest” in a non-standard (and possibly circular) way here. We can’t easily tell from the outside how much care they invested in forming those beliefs, or how self-deluded they are. All we can tell is whether the belief, in retrospect, seems to have been plausible given the evidence available at the time. If you want to label it as “not honest” whenever it seems wacky to us, then yes, tautologically, honest people don’t come to hold wacky beliefs.
The impression I have is that N and K (and many scientists since) weren’t into numerology or mysticism to impress their peers or to receive external benefits: they really did believe, based on internal factors.
Did they make a living out of those beliefs?

See, what we have here is a belief cluster that makes the belief-generator feel very good (saving the world, the other smart people being less smart, etc.) and pays his bills. That is awfully convenient for a reasoning error. I’m not saying it is entirely impossible to have a serendipitously useful reasoning error, but it doesn’t seem likely.
edit: Note, I’m not speaking of some inconsequential honesty in idle thought, or anything similarly philosophical. I’m speaking of not exploiting others for money. There’s nothing circular about the notion that an honest person would not talk a friend into paying him upfront to fix the car when he has no discernible objective reason whatsoever to think he could fix the car, while a dishonest person would. Now, if we were speaking of a very secretive person who doesn’t like to talk about himself, there would be some probability of a big list of impressive accomplishments we haven’t heard of...
It’s possible we are just using terms differently. I agree that people are biased by their self-interest. I just don’t think that bias is a form of dishonesty. It’s a very easy mistake to make, and nearly impossible to prevent.
I don’t think SIAI is unusually guilty of this or unusually dishonest.
In science, everybody understands that researchers are biased toward believing their own results, and toward believing new results that make their existing work more important. Most professional scientists are routinely in the position of explaining to funding agencies why their work is extremely important and needs lots of government grant dollars. Everybody, not just SIAI, has to talk donors into funding their Very Important Work.
For government grants and prestigious publications, we try to mitigate the bias by having expert reviewers. We also tolerate a certain amount of slop. SIAI is cutting out the government and trying to convince the public, directly, to fund their work. It’s an unusual strategy, but I don’t see that it’s dishonest or immoral or even necessarily unwise.
You are declaring everything gray here so that, verbally, everything comes out equal.
There are people with no knowledge of physics and no inventions to their name whose first ‘invention’ is a perpetual motion device. You really don’t see anything dishonest about holding an unfounded belief that you’re that smart? You really see nothing dishonest about accepting money on this premise without doing due diligence, such as first trying yourself at something testable, even if you do think you’re that smart?
There are scientists who try very hard to follow processes that are not prone to error, people trying to come up with ways to test their beliefs. Do you really see them all as equal in their level of dishonesty?
There are people who are honestly trying to make a perpetual motion device, who sink their own money into it and never produce anything they can show to investors, because they are honest and don’t use hidden wires, etc. (The equivalent would be Eliezer moving to a country with a very low cost of living, canceling his cryonics subscription, and so on, to maximize the money available for doing the very important work in question.)
You can talk all day in qualitative terms about how it is all the same, present an unimportant difference as the only one, and assert that you ‘don’t see the moral difference’, but this ‘counter-argument’ you’re making is entirely generic and applies equally to any form of immoral or criminal conduct. A court wouldn’t be the least bit impressed.
Also, I don’t go philosophical here. I don’t care what’s going on inside the head unless I’m interested in neurology. I know that the conduct is dishonest, and that the beliefs under which an honest agent would engage in such conduct lack foundation; there is no honest error here that resulted in a belief leading an honest agent to such conduct. Convincing liars don’t seem to work by thinking ‘how could I lie?’; they just adopt the convenient falsehood as a high-priority axiom for talk and a low-priority axiom for walk, so as to resolve contradictions in the most useful way, and that makes it very, very murky what they actually believe (a toy sketch of this below).
You can say that it is honest to act on a belief, but that’s an old idea; nowadays things are more sophisticated, and it is a get-out-of-jail-free card for almost all liars, who first make up a very convenient, self-serving false belief with not a trace of honesty in the belief-making process, and then act on it.
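To make that talk/walk split concrete, here is a minimal toy sketch (the class name, belief strings, and rankings are all invented for illustration; this is a caricature, not a claim about anyone’s actual cognition). The agent keeps one belief ranking for speech and another for decisions with personal stakes, so no single query ever surfaces the contradiction:

```python
# Toy model of the 'high-priority axiom for talk, low-priority axiom for walk'
# idea. All names, belief strings, and rankings are invented for illustration.

class SplitAgent:
    def __init__(self):
        # Beliefs ranked separately per context; the first entry wins.
        self.talk_ranking = ["my project will save the world", "I might be wrong"]
        self.walk_ranking = ["I might be wrong", "my project will save the world"]

    def assert_position(self):
        # When talking (to donors, to the public), the convenient axiom
        # outranks the doubt.
        return self.talk_ranking[0]

    def choose_action(self):
        # When the agent's own resources are at stake, the ranking flips.
        if self.walk_ranking[0] == "I might be wrong":
            return "hedge: keep the comfortable lifestyle, don't cut expenses"
        return "go all in: spend everything on the supposedly vital work"

agent = SplitAgent()
print(agent.assert_position())  # -> my project will save the world
print(agent.choose_action())    # -> hedge: keep the comfortable lifestyle, ...
```

Each context is internally consistent on its own; the incompatibility only shows up when you compare what is asserted with what is acted on, which is exactly what makes the question of what they actually believe so murky.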
Your hypothetical is a good one. And you are correct: I don’t think you are dishonest if you are sincerely trying to build or sell a perpetual motion machine. You’re still wrong, and even silly, but not dishonest. I need a word to refer to conscious, knowing deception, and “dishonest” is the most useful word for that purpose. I can’t let you use it for some other purpose; I need it where it is.
The argument is not applicable to all criminal conduct. In American criminal law, we pay a lot of attention to the criminal’s state of mind. Having the appropriate criminal state of mind is an essential element of many crimes. It’s not premeditated murder if you didn’t expect the victim to die. It’s not burglary if you thought you lived there. It’s utterly routine—and I think morally necessary—to ask juries “what was the defendant’s intention or state of mind?”. There is a huge moral and practical difference between a conscious and an unconscious criminal. Education much more easily cures the latter, while punishment is comparatively ineffective. For the conscious criminal, the two are reversed: punishment is often appropriate, whereas education has limited benefits.
I don’t believe I am giving liars a get-out-of-jail-free card. Ignorance isn’t an unlimited defense, and I don’t think it is so easy to convince an outside observer (or a jury) that you’re ignorant in cases where knowledge would be expected. If you really truly are in a state of pathological ignorance and it’s a danger to others, we might lock you up as a precaution, but you wouldn’t be criminally liable.
As to scientific ethics: all human processes have a non-zero chance of error. The scientists I know are pretty cynical about the process. They are fighting to get papers published and they know it. But they do play by the rules—they won’t falsify data or mislead the reader. And they don’t want to publish something if they’ll be caught out having gotten something badly wrong. As a result, the process works pretty much OK. It moves forward on average.
I think SIAI is playing by similar rules. I’ve never seen them caught lying about some fact that can be reliably measured. I’ve never seen evidence that they are consciously deceiving their audience. If they submitted their stuff to a scientific publication and I were the reviewer, I might try to reject the paper, but I wouldn’t think of trying to have them disciplined for submitting it. In science, we don’t accuse people of misconduct for being wrong, or pigheaded, or even for being overly biased by their self-interest. Is there some more serious charge you can bring against SIAI? How are they worse than any scientist fighting for a grant based on shaky evidence?
I think you have a somewhat simplistic idea of justice… there is “voluntary manslaughter”, there is “gross negligence”, and so on. I think SIAI falls under the latter category.
How are they worse than any scientist fighting for a grant based on shaky evidence?
Quantitatively, and by a huge amount. edit: Also, the kind of beliefs that they claim to hold, when held honestly, result in a massive loss of resources, such as moving to a cheaper country to save money, and so on. I dread to imagine what would happen to me if I were honestly this mistaken about AI. Erroneous beliefs damage you.
The lying consists of holding two incompatible sets of beliefs that are picked between based on convenience.
edit: To clarify, justice is not about the beliefs held by the person. It is more about the process the person used to arrive at the actions (see the whole ‘reasonable person’ doctrine). If A wants to kill B, edits A’s own beliefs to “B is going to kill me”, and then kills B in self-defense, then, if the justice system had a log of A’s processing, A would go down for premeditated murder, even though at the time of the killing A is honestly acting in self-defense. (Furthermore, barring some gross neurophysiological anomaly, it is a fact of reality that justice can only act on the inputs and outputs of agents.)
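As a toy sketch of that last point (the event names and the `verdict` function are invented for illustration, not a model of any real legal process): given a log of the agent’s processing, the verdict keys off whether the exculpatory belief was self-installed earlier in the log, not off the belief sincerely held at the moment of acting.

```python
# Toy illustration: the verdict depends on the process log, not on the
# belief the agent sincerely held at the moment of acting.
# Event names are invented for this example.

def verdict(log):
    """log is an ordered list of (event, detail) tuples."""
    belief_was_self_installed = any(
        event == "edit_own_belief" and detail == "B is going to kill me"
        for event, detail in log
    )
    acted = any(event == "kill" for event, _ in log)
    if acted and belief_was_self_installed:
        return "premeditated murder"  # the self-defense belief was manufactured
    if acted:
        return "self defense"         # same final belief, honest process
    return "no crime"

honest_log = [("observe", "B draws a weapon"), ("kill", "B")]
edited_log = [("want", "B dead"),
              ("edit_own_belief", "B is going to kill me"),
              ("kill", "B")]

print(verdict(honest_log))  # -> self defense
print(verdict(edited_log))  # -> premeditated murder
```

Both logs end with the agent holding the same belief at the moment of the act; only the process distinguishes them, which is the sense in which justice judges the process rather than the belief.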