Expressing negative judgments of someone’s intellectual output could be an honest report, generated by looking at the output itself and extrapolating a pattern. Epistemically speaking, this is fine. Alternatively, it could be motivated by something more like politics; someone gets offended, or has a conflict of interest, then evaluates things in a biased way. Epistemically speaking, this is not fine.
So, if I were to take a stab at what the true rule of epistemic conduct here is, the primary rule would be that you ought to evaluate the ideas before evaluating the person, in your own thinking. There are also reasons why the order of evaluations should be ideas-before-people in the written product: it sets a better example of what thought processes are supposed to look like, and it’s less likely to mislead people into biased evaluations of the ideas. But this is less fundamental and less absolute than the ordering of the thinking.
But.
Having the order-of-evaluations wrong in a piece of writing is evidence, in a Bayesian sense, of having also had the order-of-evaluations wrong in the thinking that generated it. Based on the totality of omnizoid’s post, I think that, in this case, the heuristic was accurate. The post is full of overreaches and hyperbolic language. It presents each disagreement as though Eliezer were going against an expert consensus, when in fact each position mentioned is one where he sided with a camp in an extant expert divide.
And...
Over in the legal profession, they have a concept called “appearance of impropriety”: for some types of misconduct, they consider it important not only to avoid the misconduct itself but also to avoid doing things that look too similar to misconduct.
If I translate that into something that could be the true rule, it would be something like: if an epistemic failure mode looks especially likely, both in the sense of a view-from-nowhere risk analysis and in the sense that your audience will think you’ve fallen into the failure mode, then some things that would normally be epistemically supererogatory become mandatory instead.
Eliezer’s criticism of Stephen J. Gould does not follow the stated rule of responding to a substantive point before making any general criticism of the author. I lean towards modus tollens over modus ponens: this makes the criticism of Stephen J. Gould worse. But how much worse depends on whether that’s a reflection of an inverted generative process, or an artifact of how he wrote it up. I think it was probably the latter.
I expect that most people (with an opinion) evaluated Yudkowsky’s ideas prior to evaluating him as a person. After all, Yudkowsky is an author, and almost all of his writing is intended to convey his ideas. His writing reaches far more people than he does in person, and most of his readers have never met him. I think the linked post is evidence that omnizoid in particular evaluated Yudkowsky’s ideas first, and that he initially liked them.
It’s not clear to me what your hypothesis is. Does omnizoid have a conflict of interest? Were they offended by something? Are they lying about being a big fan for two years? Do they have some other bias?
Even if someone is motivated by an epistemic failure mode, I would still like to see the bottom line up front, so I can decide whether to read, and whether to continue reading. Hopefully the failure mode will be obvious and I can stop reading sooner. I don’t want a norm where authors have to guess whether the audience will accuse them of bias in order to decide what order to write their posts in.
//Having the order-of-evaluations wrong in a piece of writing is evidence, in a Bayesian sense, of having also had the order-of-evaluations wrong in the thinking that generated it.//
As I understand it, this is an accusation of an author having written the bottom line first—yes?
If so, it would be good to be clear on that point. In other words, we should be clear that the problem isn’t anything about ordering, but that the author’s stated reasons, and reasoning, were actually not what led them to their stated conclusion.
And there is another point. Nobody[1] starts disliking someone for no reason, as a wholly uncaused act of hate-thought. So there must’ve been some other reason why the author of a post attacking someone (e.g. Eliezer, S. J. Gould, etc.) decided that said person was bad/wrong/whatever. But since the stated reason isn’t (we claim) the real reason, the real reason must be something else, which we are not being told.
So the other half of the accusation is that the post author is hiding from us the real reasons why they believe their conclusion (while bamboozling us with fake reasons).
This again has nothing to do with any questions of ordering. Clarity of complaints is paramount here.
//It presents each disagreement as though Eliezer were going against an expert consensus, when in fact each position mentioned is one where he sided with a camp in an extant expert divide.//
Nope, false. As far as I know, there are no academic decision theorists who endorse FDT, no philosophers of mind who agree with Eliezer’s assessment that epiphenomenalism is the term for those who accept zombies, and no relevant experts on consciousness who share Eliezer’s confidence that animals aren’t conscious.
[1] Exceptions might be caused by mental illness or some such, which is irrelevant here.