It was the same rationale. “We know what’s best for everybody else, so we will take power!”
Besides the fact that those revolutionaries were wrong from the beginning, they purged each other throughout the process, so that the most cunning one was selected, who was even more wrong than those early revolutionaries were. Or maybe Stalin was more right than Trotsky, who knows, but it didn’t matter very much. Even Lenin was wrong.
But even if Lenin was right, Andropov would still have been corrupted.
Having actually lived under a regime that purported to “change human behaviour to be more in line with reality”, my prior for such an attempt being made in good faith to begin with is accordingly low.
Attempts to change society invariably result in selection pressures for effectiveness outmatching those for honesty and benevolence. In a couple of generations, the only people left in charge are the kind of people you definitely wouldn’t want in charge, unless you’re the kind of person nobody wants in charge in the first place.
I’m thinking about locating specific centers of our brains, reducing certain activities which undoubtedly make us less aligned with reality, and increasing the activation of others.
This is the kind of thinking that, given a few years of unchecked power and primate group competition, leads to mass programs of rearranging people’s brain centres with 15th century technology.
Why don’t you spend some time instead thinking about how your forced rationality programme is going to avoid the pitfall all the others so far have fallen into: megalomania and genocide? And why are you so sure your beliefs are the final and correct ones to force on everyone through brain manipulation? If we had the technology to enforce beliefs a few centuries ago, would you consider it a moral good to freeze the progress of human thought at that point? Because that’s essentially what you’re proposing from the point of view of all potential futures where you fail.
Attempts to change society invariably result in selection pressures for effectiveness outmatching those for honesty and benevolence. In a couple of generations, the only people left in charge are the kind of people you definitely wouldn’t want in charge, unless you’re the kind of person nobody wants in charge in the first place.
You’re excluding being aligned with objective reality (accepting facts, etc.) from said effectiveness. Otherwise, it’s useless.
This is the kind of thinking that, given a few years of unchecked power and primate group competition, leads to mass programs of rearranging people’s brain centres with 15th century technology.
I’m unsure why you’re presuming that rearranging people’s brains isn’t already done constantly, independent of our volition. This simply asks how we can do it deliberately, with our current knowledge.
Why don’t you spend some time instead thinking about how your forced rationality programme is going to avoid the pitfall all the others so far have fallen into: megalomania and genocide?
Why would it lead to megalomania and genocide, when those are not aligned with reality? An understanding of neuroscience and evolutionary biology, presuming you were aligned with reality enough to figure it out and accept the facts, would be enough, while still understanding that we can be wrong until we know more.
And why are you so sure your beliefs are the final and correct ones to force on everyone through brain manipulation?
As I said, “this includes uncertainty of facts (because of facts like an interpretation of QM),” which makes us embrace uncertainty: that reality is probabilistic under this interpretation. It’s not absolute.
If we had the technology to enforce beliefs a few centuries ago, would you consider it a moral good to freeze the progress of human thought at that point?
Because that’s essentially what you’re proposing from the point of view of all potential futures where you fail.
Why do you think that?
It was the same rationale. “We know what’s best for everybody else, so we will take power!”
Besides the fact that those revolutionaries were wrong from the beginning, they purged each other throughout the process, so that the most cunning one was selected, who was even more wrong than those early revolutionaries were. Or maybe Stalin was more right than Trotsky, who knows, but it didn’t matter very much. Even Lenin was wrong.
But even if Lenin was right, Andropov would still have been corrupted.
I didn’t really mean that. It was just setting an emotional stage for the rest of the comment. What do you think of the rest?
Having actually lived under a regime that purported to “change human behaviour to be more in line with reality”, my prior for such an attempt being made in good faith to begin with is accordingly low.
Attempts to change society invariably result in selection pressures for effectiveness outmatching those for honesty and benevolence. In a couple of generations, the only people left in charge are the kind of people you definitely wouldn’t want in charge, unless you’re the kind of person nobody wants in charge in the first place.
This is the kind of thinking that, given a few years of unchecked power and primate group competition, leads to mass programs of rearranging people’s brain centres with 15th century technology.
Why don’t you spend some time instead thinking about how your forced rationality programme is going to avoid the pitfall all the others so far have fallen into: megalomania and genocide? And why are you so sure your beliefs are the final and correct ones to force on everyone through brain manipulation? If we had the technology to enforce beliefs a few centuries ago, would you consider it a moral good to freeze the progress of human thought at that point? Because that’s essentially what you’re proposing from the point of view of all potential futures where you fail.
You’re excluding being aligned with objective reality (accepting facts, etc.) from said effectiveness. Otherwise, it’s useless.
I’m unsure why you’re presuming that rearranging people’s brains isn’t already done constantly, independent of our volition. This simply asks how we can do it deliberately, with our current knowledge.
Why would it lead to megalomania and genocide, when those are not aligned with reality? An understanding of neuroscience and evolutionary biology, presuming you were aligned with reality enough to figure it out and accept the facts, would be enough, while still understanding that we can be wrong until we know more.
As I said, “this includes uncertainty of facts (because of facts like an interpretation of QM),” which makes us embrace uncertainty: that reality is probabilistic under this interpretation. It’s not absolute.
I’m not.