I have been getting interested in Vassarism recently, but this post makes me think that's a mistake. I was otherwise just about to set up a meeting where he would teach me stuff while I pay him $250/h; this now seems like a bad idea.
What is “Vassarism”, anyway? Could you (or anyone else) give an “executive summary”?
The one-sentence summary as I understand it would be “The forms of discourse considered civilized/nice by elites (in the education system, the workplace, and politics) work by obscuring and suppressing information; we need to talk about this so we can stop it”.
An example that I’m not personally familiar with but which seems broadly accepted by non-Vassarite rationalists would be how tech startups appeal to funders: https://www.lesswrong.com/posts/3JzndpGm4ZgQ4GT3S/parasitic-language-games-maintaining-ambiguity-to-hide (this is not written by a Vassarite)
Another example that is not written by a Vassarite but which seems relevant: https://slatestarcodex.com/2017/06/26/conversation-deliberately-skirts-the-border-of-incomprehensibility/
Vassar didn’t like my recent Substack post, but he did really like White Fragility, and from what I’ve heard (not from a Vassarite), this blog post, which I also linked to in my Substack post, contains the important part of White Fragility: https://thehumanist.com/magazine/july-august-2015/fierce-humanism/the-part-about-black-lives-mattering-where-white-people-shut-up-and-listen/
According to Michael Vassar, the core schtick of rationalism is that we want truth-promoting discourse. So the follow-up implication, if the Vassarites are right, is that Vassarism is the proper continuation of rationalism.
Yeah, that makes sense. My question is whether his style also obscures things. Rotation can cut up a shape, if the lens isn’t lined up to the types.
I see, thank you.
I guess I should add, the Vassarites are especially concerned with this phenomenon when it acts to protect corrupt people in power, and a lot of the controversy between the Vassarites and rationalist institutions such as MIRI/CEA/CFAR is about the Vassarites arguing that those institutions are guilty of this too.
Are there any institutions, according to Vassarites, that are not guilty of this?
Dunno, maybe Quakers. But the point is not that rationalism is especially egregious about how much it does it, but rather that the promise of rationalism is to do better.
(And! Some of the key rationalist concerns are bottlenecked on information-suppression. For instance, a lot of people deploy these information-suppression strategies against AI x-risk.)
All gurus are grifters. It’s one of those things that seems like an unfounded generalization until you get a little bit of firsthand experience and go “ohhh, that’s why it was common sense”.