Not necessarily guilt-by-association, but maybe rather pointing out that the two arguments/conspiracy theories share a similar flawed structure, so if you discredit one, you should discredit the other.
Still, I’m also unsure how much structure they share, and even if they did, I don’t think this would be discursively effective because I don’t think most people care that much about (that kind of) consistency (happy to be updated in the direction of most people caring about it).
I read the “TESCREAL” paper recently and wrote some thoughts about it in an ACX Open Thread.
It also gave me conspiracy theory vibes, as it tried too hard to connect various groups and people that are part of the sinister-sounding “TESCREAL” (including a table of individuals and organizations involved in its various parts), trace their roots back to eugenicists (but also Plato and Aristotle), and warn about their wealth and influence.
It reminded me of how some people in my country love to compile lists of people working at various non-profits to prove how this is all linked to Soros and how they are all servants of American propaganda trying to destroy our independence. Because apparently you cannot volunteer in a shelter for abandoned puppies without being part of some larger sinister plot.
From the Dark Arts perspective, I think it would be useful to sigh and say “oh, this conspiracy theory again?” to signal that you consider the authors low-status. But then focus on the object-level objections.
The actual objection, from my perspective, is that the thing that connects the parts of “TESCREAL” is simply “nerds who care, and think that technology is the answer”. Some parts are more strongly related; if you believe in technological progress, then longtermism and transhumanism and extropianism and cosmism are more or less the same thing: the belief that in the future, humans will overcome their current limitations using technology. That should not really come as a huge surprise to anyone.
The connection with EA is cherry-picking; yes, there are some longtermist projects, but most of it is stuff like curing malaria. But of course, you can’t say that, if your agenda is to call them ~~Nazis~~ eugenicists.
And the connection with eugenicists is mostly “you know who else worried about the future of humanity?” (I find it difficult to think of a more appropriate response than “fuck you!”) But also, speaking about intelligence is a taboo, which means that it is a taboo to worry about artificial intelligences becoming potentially smarter than humans. -- Here, I think a potential solution would be to push the authors towards making some object-level statements. Not just “people who say X are like ~~Hitler~~ eugenicists”, but state your opinion clearly, whether it is “X” or “not X”; make a falsifiable statement.
But I think it is not too uncharitable to summarize the paper as “a conspiracy theory claiming that people who donate money to African charities that cure malaria are secretly eugenicists”, because that is an important part of the “TESCR-EA-L” construct.