I would guess that a lot (perhaps most) of the time, “salvage epistemology” is a rationalization to give to rationalists to justify their interest in woo, as opposed to being the actual reason they are interested in the woo. (I still agree that the concept is likely hazardous to some people.)
There is also a related phenomenon: when a community that otherwise/previously accepted only people who bought into that community’s basic principles (aspiration to rationality, belief in the need for clear reasoning, etc.) adopts “salvage epistemology”, that community now opens itself up to all manner of people who are, shall we say, less committed to those basic principles, or perhaps not committed at all. This is catastrophic for the community’s health, sanity, integrity, ability to accomplish anything, and finally its likelihood of maintaining those very basic principles.
In other words, there is a difference between a community of aspiring rationalists of whom some have decided to investigate various forms of woo (to see what might be salvaged therefrom)—and the same community which has a large contingent of woo-peddlers and woo-consumers, of whom none believe in rationalist principles in the first place, but are only there to (at best) hang out with fellow peddlers and consumers of woo. The former community might be able to maintain some semblance of sanity even while they make their salvage attempts; the latter community is doomed.
It is difficult to distinguish between (1) people who think that there may be some value in a given woo, and that it is worth exploring it and separating the wheat from the chaff, and (2) people who already believe that the woo is useful, and whose only question is how to make it more palatable to the rationalist community. Both of these groups stand together in opposition to people who would refuse to touch the woo on principle.
The subtle difference between those two groups is the absence or presence of motivated reasoning. If you are willing to follow the evidence wherever it may lead, you are open to the possibility that horoscopes may actually correlate with something useful, but you are also open to the possibility that they may not. The “salvage at all costs” group, by contrast, already knows that horoscopes are useful, and useful in more or less the traditional way; their only question is how to convince the others, who are immune to the traditional astrological arguments. To them it seems mostly a question of using the right lingo: perhaps if we renamed Pisces to “cognitive ichthys”, the usual objections would stop and rationalists would finally accept that Pisces might actually be cognitively different from other people, especially if a high-status community member supported it publicly with an anecdote or two.
(The opposite kind of mistake would be refusing to accept in principle that being a Virgo might correlate with your success at school… simply because it makes you one of the oldest kids in the classroom.)
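The Virgo example is just the relative-age effect. A toy simulation (assuming a hypothetical September 1 school cutoff, and treating “school success” as nothing but maturity at enrollment plus noise) shows how a birth-date/success correlation can appear with no astrology involved:

```python
import random

random.seed(0)

def simulate_child():
    # Months born after a Sept 1 school cutoff (uniform over the year).
    m = random.uniform(0, 12)
    # A child enrolls the September after turning 5, so age at enrollment
    # (in months) runs from ~60 (born just before the cutoff) to ~72
    # (born just after it -- these are the oldest kids in the class).
    age_at_enrollment = 60 + (12 - m)
    # "School success" is modeled as maturity plus noise; the birth sign
    # plays no causal role whatsoever.
    success = age_at_enrollment + random.gauss(0, 5)
    return m, success

children = [simulate_child() for _ in range(20000)]
oldest = [s for m, s in children if m < 2]      # born just after the cutoff
youngest = [s for m, s in children if m >= 10]  # born just before the next one

mean = lambda xs: sum(xs) / len(xs)
# The "oldest" cohort outscores the "youngest" purely via relative age.
print(round(mean(oldest) - mean(youngest), 1))
```

The gap between the cohorts here is roughly the ten months of extra maturity, which is the entire point: the correlation is real, but the causal story has nothing to do with the stars.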
Maybe it’s a question of timing. First prove that some part of the woo makes sense; then use it. Do not simply start with the assumption that surely most of the woo will be salvageable somehow; it may not.
I think we’re talking about the same distinction? Or did you mean to specifically disagree with me / offer a competing view / etc.?
Maybe it’s a question of timing. First prove that some part of the woo makes sense; then use it. Do not simply start with the assumption that surely most of the woo will be salvageable somehow; it may not.
I’d go further and say: first give us a good reason why we should think that it’s even plausible or remotely likely that there’s anything useful in the woo in question. (Otherwise, what motivates the decision to attempt to “salvage” this particular woo? Why, for example, are you trying to “salvage” Buddhism, and not Old Believer-ism?) How, in other words, did you locate the hypothesis that this woo, out of all the nonsense that’s been purveyed by all the woo-peddlers over the whole history of humanity, is worth our time and attention to examine for salvageability?
I mostly agree. I believe it is possible—and desirable—in theory to do the “salvage epistemology” correctly, but sadly I suspect that in practice 90% of wannabe rationalists will do it incorrectly.
Not sure what the correct strategy is here, because telling people “be careful” will probably just result in them saying “yes, we already are” when in fact they are not.
Why, for example, are you trying to “salvage” Buddhism, and not Old Believer-ism?
That actually makes sense. I would assume that each of them contains maybe 5% useful stuff, but almost all the useful stuff in Old Believer-ism is probably shared with the rest of Christianity, and maybe 1⁄3 of that is already “in the water supply” if you grew up in a Christian culture.
Also, the “Buddhism” popular in the West is probably quite different from the original Buddhism; it is filtered for a modern audience. There is a big focus on meditation and equanimity, and mostly silence about Buddha performing literal miracles or about how having the tiniest sexual thought will fuck up your afterlife. (So it’s kinda like Jordan B. Peterson’s idea of Christianity, compared to the actual Christianity.) So I wouldn’t be surprised if Western “Buddhism” actually contained 10% useful stuff.
But of course, 10% correct still means 90% incorrect. And when I hear some people in the rationalist community talk about Buddhism, they do not sound like someone who is 90% skeptical.
Broadly speaking, it’s useful to have a wide range of ideas, because you can’t guarantee that the ideas that are “local” to you are the best ones. It’s gradient descent stuff.
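The gradient-descent analogy can be made concrete: a search that only exploits the ideas “local” to it converges to a nearby optimum, while restarts sampled from a wider range can find a better one. A minimal sketch, over a hypothetical 1-D “idea landscape” chosen so the local start is stuck in a mediocre basin:

```python
import math
import random

# A landscape with several minima; the basin near the origin is mediocre,
# and a better minimum lies further away (near x ~ 3.7).
def f(x):
    return math.sin(3 * x) + 0.1 * (x - 3) ** 2

def grad(x, h=1e-6):
    # Numerical derivative, to keep the example self-contained.
    return (f(x + h) - f(x - h)) / (2 * h)

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent: pure exploitation of the starting point.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

local = descend(0.0)  # start from the "local" idea only

random.seed(1)
# Exploration: restart from points all over the range, even unpromising ones.
restarts = [descend(random.uniform(-5, 5)) for _ in range(40)]
best = min(restarts, key=f)

print(f(best) < f(local))  # the wide search finds a lower minimum
```

This is the random-restart trick: none of the individual descents is smarter than the first one; the advantage comes entirely from not assuming your starting neighborhood contains the best answer.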
You do say “a lot”/“most”, but at least for me this is totally backwards. I only looked at woo-type stuff because it was the only place attempting to explain some aspects of my experience. Rationalists were leaving bits of reality on the floor, so I had to look elsewhere and then perform hermeneutics to pick out the useful bits (and then try to bring them back to rationalists, with varying degrees of success).