I think this constitutes a rejection of rationalism and effective altruism?
Well, I do reject EA, or rather its intellectual foundation in Peter Singer and radical utilitarianism. But that’s a different discussion, involving the motte-and-bailey of “Wouldn’t you want to direct your efforts in the most actually effective way?” vs “Doing good isn’t the most important thing, it’s the only thing”.
Rationalism in general, understood as the study and practice of those ways of thought and action that reliably lead towards truth and effectiveness and not away from them, yes, that’s a good thing. Eliezer founded LessWrong (and before that, co-founded Overcoming Bias) because he was already motivated by the threat of AGI, but saw a basic education in how to think as a prerequisite for anyone to be capable of having useful ideas about AGI. The AGI threat drove his rationalism outreach, rather than rationalism leading to the study of how to safely develop AGI.
Carnists seem to believe that …
I notice that people who eat meat are generally willing to accommodate vegetarians when organising a social gathering, and perhaps also vegans but not necessarily. I would expect them to throw out any vegan who knowingly comes into a non-vegan setting and starts screaming about dead animals.
More generally, calling anyone who doesn’t care about someone’s ideology because they have better things to think about “ideological” is on the way to saying “everything is ideological, everything is political, everything is problematic, and if you’re not for us you’re against us”. And some people actually say that. I think they’re crazy, and if I see them breaking and entering, I’ll call the police on them.
Rationalism in general, understood as the study and practice of those ways of thought and action that reliably lead towards truth and effectiveness and not away from them, yes, that’s a good thing. Eliezer founded LessWrong (and before that, co-founded Overcoming Bias) because he was already motivated by the threat of AGI, but saw a basic education in how to think as a prerequisite for anyone to be capable of having useful ideas about AGI. The AGI threat drove his rationalism outreach, rather than rationalism leading to the study of how to safely develop AGI.
Maybe a way to phrase my objection/confusion is:
In this quote, it seems like you are admitting that the epistemic environment does influence subject (“thought”) and action on some “small scale”. Like for instance rationalism might make people focus on questions like instrumental convergence and human values (good epistemics) instead of the meaning of life (bad epistemics due to lacking the concept of orthogonality), and might e.g. make people focus on regulating rather than accelerating AI.
Now my thought would be that if it influences subject and action on the small scale, then presumably it also influences them on the large scale; after all, there’s no obvious dividing line between the scales. But from this I now infer that you do draw some distinction between these scales?
I notice that people who eat meat are generally willing to accommodate vegetarians when organising a social gathering, and perhaps also vegans but not necessarily. I would expect them to throw out any vegan who knowingly comes into a non-vegan setting and starts screaming about dead animals.
I didn’t say anything about screaming. It could go something like this:
Amelia’s living room was a dance of warm hues with fairy lights twinkling overhead. Conversations ebbed and flowed as guests exchanged stories and laughter over drinks. The centerpiece of the food table was a roasted chicken, its golden-brown skin glistening under the ambient light.
As guests approached to fill their plates, Luna, with her striking red hair, made her way to the table. She noticed the chicken and paused, taking a deep breath.
Turning to a group that included Rob, Amelia, and a few others she didn’t know well, she said, “It always makes me a bit sad seeing roasted chickens at gatherings.” The group paused, forks midway to their plates, to listen to her. “Many of these chickens are raised in conditions where they’re tightly packed and can’t move freely. They’re bred to grow so quickly that it causes them physical pain.”
Increasing the volume of one’s speech is physically unpleasant and makes it harder for others to get a word in; its one advantage is that it is easier to hear over background noise. Screaming is therefore a sign that something non-truthseeking is going on, though not necessarily on the screamer’s part: they might be trying to drown out others who are being non-truthseeking. In practice, though, I expect either both parties to be truthseeking or both to be non-truthseeking.
More generally, calling anyone who doesn’t care about someone’s ideology because they have better things to think about “ideological” is on the way to saying “everything is ideological, everything is political, everything is problematic, and if you’re not for us you’re against us”. And some people actually say that.
I don’t think one can avoid ideologies, or that it would be desirable to do so.
Turning to a group that included Rob, Amelia, and a few others she didn’t know well, she said, “It always makes me a bit sad seeing roasted chickens at gatherings.” The group paused, forks midway to their plates, to listen to her. “Many of these chickens are raised in conditions where they’re tightly packed and can’t move freely. They’re bred to grow so quickly that it causes them physical pain.”
One of them replies with a shrug, “So I’ve heard. I can believe it.” Another says, “You knew this wasn’t a vegan gathering when you decided to come.” A third says, “You have said this; I have heard it. Message acknowledged and understood.” A fourth says, “This is important to you; but it is not so important to me.” A fifth says, “I’m blogging this.” They carry on gnawing at the chicken wings in their hands.
These are all things that I might say, if I were inclined to say anything at all.
In this quote, it seems like you are admitting that the epistemic environment does influence subject (“thought”) and action on some “small scale”. Like for instance rationalism might make people focus on questions like instrumental convergence and human values (good epistemics) instead of the meaning of life (bad epistemics due to lacking the concept of orthogonality), and might e.g. make people focus on regulating rather than accelerating AI.
By “epistemic environment” I understand the standard of rationality present there. Rationality is a tool that can be deployed towards any goal. A sound epistemic environment is no guarantee that the people in it espouse any particular morality.
I agree that morality is not solely determined by epistemics; the orthogonality thesis holds. However, people’s opinions are still influenced by the information they have, e.g. because expected-utility calculations change when the probabilities they assign change, among other things.
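The expected-utility point can be made concrete with a toy calculation. This is only an illustrative sketch: the two options, the utility numbers, and the probabilities are all hypothetical, not taken from the discussion. It shows how an agent with fixed values can be led to a different action purely by a change in information (here, their credence that farmed chickens suffer).

```python
# Toy sketch (hypothetical numbers): the same utility function can
# recommend different actions once new information shifts probabilities.

def expected_utility(action, p_suffering, utilities):
    """E[U] of an action given the credence that farmed chickens suffer."""
    u_if_suffering, u_if_not = utilities[action]
    return p_suffering * u_if_suffering + (1 - p_suffering) * u_if_not

# Utilities (arbitrary, for illustration): (value if suffering is real, value if not)
utilities = {
    "serve_chicken": (-5.0, 10.0),  # tasty, but bad if it caused suffering
    "serve_vegan":   ( 3.0,  3.0),  # unaffected by the suffering question
}

def best_action(p_suffering):
    """Pick the action maximising expected utility at a given credence."""
    return max(utilities, key=lambda a: expected_utility(a, p_suffering, utilities))

# Same values, different information, different choice:
print(best_action(0.1))  # low credence in suffering -> serve_chicken
print(best_action(0.9))  # high credence in suffering -> serve_vegan
```

The values never change between the two calls; only the probability does, which is the sense in which better information can shift action without touching morality.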
One of them replies with a shrug, “So I’ve heard. I can believe it.” Another says, “You knew this wasn’t a vegan gathering when you decided to come.” A third says, “You have said this; I have heard it. Message acknowledged and understood.” A fourth says, “This is important to you; but it is not so important to me.” A fifth says, “I’m blogging this.” They carry on gnawing at the chicken wings in their hands.
These are all things that I might say, if I were inclined to say anything at all.
Valid responses.