I don’t know much about any of that, but blaming the first on Alinsky sounds just ridiculous.
True, I was exaggerating by blaming him for the effects of the movement he was a part of.
And do you think he only worked with blacks?
No, and I’m sure he did some similar damage to some white communities as well.
Are successful (non-criminal) black businessmen hated and despised in their communities?
Well, it depends on how they succeeded (someone who succeeded in sports or music is more accepted than someone who succeeded through business).
(Overall, you sound a touch mind-killed.)
What about yourself? At the risk of engaging in internet cold reading I think you were so scarred by what you perceive as “right wing technocracy” as expressed by Moldbug and some of his fans on LW that you’re desperately looking for any ideology/movement that seems strong enough to oppose it.
At the risk of engaging in internet cold reading I think you were so scarred by what you perceive as “right wing technocracy” as expressed by Moldbug and some of his fans on LW that you’re desperately looking for any ideology/movement that seems strong enough to oppose it.
Well, there’s a grain of truth to that, but I’ll try not to compromise my ethics in doing so. I’d put it like this: I have my ideology-as-religion (utopian socialism, for lack of a better term) and, like with any other, I try to balance its function of formalizing intuitions versus its downsides of blinding me with dogma—but I’m open to investigating all kinds of ideologies-as-politics to see how they measure against my values, in their tools and their aims.
Also, I consider Moldbug to be relatively innocent in the grand scheme. He says some rather useful things, and anyway there are others whose thoughts are twisted far worse by that worldview I loathe; he’s simply a good example (IMO) of a brilliant person exhibiting symptoms of that menace.
My good sir, if you are a utopian socialist, it unfortunately seems to me that you are striving to treat a fungal infection while the patient is dying of cancer.
I said it’s my ideal of society, not that I’d start collectivizing everything tomorrow! Didn’t you link that story, Manna? If you approve of its ideas, then you’re at least partly a socialist too—in my understanding of the term. Also, which problems would you call “cancer”, specifically?
I said it’s my ideal of society, not that I’d start collectivizing everything tomorrow!
Oh I didn’t mean to imply you would! But surely you would like to move our current society towards that at some (slow or otherwise) rate, or at least learn enough about the world to eventually get a good plan of doing so.
If you approve of its ideas, then you’re at least partly a socialist too
Nearly every human is, I think. Socialism and its variants tap into primal parts of our minds and their ethical and political intuitions. And taking seriously most of our stated ethics, one is hard pressed not to end up a libertarian or a communist or even a fascist. Fortunately, most people don’t think too hard about politics. I don’t want the conversation to go down this path too far, though, since I fear the word “socialist” is a problematic one.
Also, which problems would you call “cancer”, specifically?
Specifically the great power structures opposing moves towards your ideal. It almost doesn’t matter which ideal, since those that I see would oppose most change, and I have a hard time considering them benevolent. Even milquetoast regular leftism thinks itself fighting a few such forces, and I would actually agree they are there. You don’t need to agree with their bogeyman; surely you see some much more potent forces shaping our world, forces that don’t seem inherently interested in your ideals and that are far more powerful than… the writer of a photocopied essay you picked up on the street?
As Moldbug himself points out, since the barrier to entry for writing an online blog is so low, absent other evidence you should take him precisely as seriously as a person distributing such photocopied essays. How many people have read anything by Moldbug? Of those, how many agree? Of those, how many are likely to act? What if you take the entire “alternative” or “dissident” or “new” right and add these people together? Do you get a million people? Do you even get 100 thousand? And recall these are dissidents! By the very nature of society, outcasts, malcontents and misfits are attracted to such thinking.
While I have no problem with you reading right-wing blogs, even a whole lot of them (I certainly do), I feel the need to point out that you cite some pretty obscure ones that even I have never heard of, let alone followed. Doesn’t that perhaps tell you that you may be operating under a distorted view or intuition of how popular these ideas are? By following their links and comment sections, your brain is tricked into seeing a different reality from the one that exists; take a survey of political opinion into your hands and check the scale of the phenomena you find troubling.
Putting things into perspective, it seems a waste to lose sleep over them, does it not? Many of them are intelligent and consistent, but then so is Will Newsome, and I don’t spend much time worrying about everlasting damnation. If you want anything that can be described as “utopian” or “socialist”, your work is cut out for you: you should be wondering how to move mountains, not stomp on molehills.
That’s a good comment, thanks. You’ve slightly misunderstood my feelings and my fears, though. I’ll write a proper response.
In brief, I fear alt-right/technocratic ideas not because they’re in any way popular or “viral” at present, but because I have a nasty gut feeling that they—in a limited sense—do reflect “reality” best of all, that by most naive pragmatist reasoning they follow from facts of life, and that more and more people for whom naive reasoning is more important than social conventions will start to adopt such thinking as soon as they’re alerted to its possibility.
And, crucially, in the age of the internet and such, there will be more and more such under-socialized, smart people growing up and thinking more independently—I fear it could be like the spread of simplified Marxism through underdeveloped and humiliated Third World countries, and with worse consequences. See the Alinsky quote above—“revolution and communism have become one”. If rationalism and techno-fascism become “one” like that, the whole world might suffer for it.
I’m following you from your links in “Nerds are Nuts” and I would like to restate your second paragraph to make sure I have your beliefs right.
The reason the alt-right is scary is not because they are wrong in their beliefs about reality, but because they are correct about the flaws they see in modern leftism, and this makes their proposals all the more dangerous. Just because a doctor can diagnose what ails you, it does not follow that he knows how to treat you. The Alt Right is correct in its diagnosis of societal cancers, but their proposals look depressingly closer to leeching than to chemotherapy.
What positive beliefs about politics do you have in light of your fear of necromancy and cancer? My intuition says some form of pragmatic Burkean conservatism but I don’t want to typecast you.
Well, I respect Burke a lot, but my true admiration goes out to people like Chesterton (a big fan of the French Revolution) and Kropotkin and Orwell and maybe even the better sort of religious leader, like John Paul II—the ones who realize the power and necessity of ideology-as-faith, but take the best from both its fronts instead of being tied down on one side. In short, I love idealism.
(If forced to pick among today’s widely used labels, though, I’d be OK with “socialist” and not at all with “conservative”.)
I thought about this on and off the rest of yesterday and my belief is that these two statements are key.
The Alt Right is correct in its diagnosis of societal cancers [...]
In short, I love idealism.
What I get from this is the divide between the epistemological and the instrumental. Using that classic LessWrong framework, I’ve come to this as a representation of your views:
In order to understand the world, if you are going to err, err on the side of Cynicism. But, if you are going to live in it and make it better, you have to err on the side of Idealism.
Cynicism is epistemologically useful but instrumentally destructive. (Explained by the fact that you agree with the alt-right’s pessimistic view of the world and its account of why things are not as good as they could be.)
Idealism is instrumentally useful but epistemologically destructive. (Explained by the fact that you regard ideology-as-faith as vitally useful, but that doesn’t make faith true.)
I really like summarizing to make sure I get things right. Watch as I prove it!
When dealing with real-world morality and goal-seeking behavior, we seem forced to stare the following facts in the face:
We are very biased.
We could be more rational.
Our rationality isn’t particularly good at morality.
Complicating this are the following:
Heuristics generally work. How much rationality do you need to outcompete moral and procedural heuristics?
Just how rational can we get? Can low-IQ people become much more rational, or are we forced to believe in a cognitive and rationality-based elite?
Should we trust moral reasoning or heuristics at all?
I’ve seen the following conclusions drawn so far by people who take bias seriously (there may be more; this is just what I’ve encountered, and the first two are jokes I couldn’t resist):
Lovecraftian: The world is ruled by evil Gods beyond imagination. I have seen too much! Go back into the warm milkbath of ignorance! Chew your cud you cows and never think of the slaughter to come!
Existentialism: Everything sucks forever, but let’s not kill ourselves, because it’s better to push a rock up a mountain or something. We can never know anything and nothing can ever mean anything, so we should talk about it forever. Give me Tenure! Linkin Park and Tenure!
Moldbuggery: Bias is bad, real fucking bad. The current systems don’t encourage rationality all that much either. Only a cognitive elite can ever become debiased enough to run things and they should only be trusted if we get a system that aligns with the interests of the subjects. (Ex: Aurini, GLADOS, Konkvistador, Moldbug, Nick Land)
[I had a section on Robin Hanson, but I don’t think I understand it well enough to summarize on reflection, so “This Page Left Blank”]
Old Whiggish: We are very biased and ought to trust intuition, tradition and reason roughly in equal measure. We prize reason too much, and so people who try to be perfectly rational are worse reasoners than those who allow a little superstition in their lives. Our heuristics are better than we think. If it works, we should keep it even if it isn’t true. (Ex: Taleb, Derbyshire, Burke, Marcus Aurelius. Almost Jonathan Haidt post-“The Righteous Mind”, but not quite)
Rational Schizophrenia: A pure cynicism about how things are should be combined with an idealism of how to act. [See above for Multithreaded’s advice]
Yudkowskyianism: Bias is very bad, but our prospects for debiasing are better than either of those make them out to be. Rationality is like martial arts: anyone can learn to use leverage regardless of cognitive strength. Though there are clear ways in which we fail, now that we have Bayesian probability theory derived from pure logic, we know how to think about these issues. To abuse a C.S. Lewis quote: “The Way has not been tried and found wanting; it has been found difficult and left untried.” Try it before giving up, because something is only undoable until somebody does it. (Ex: Lukeprog, Yudkowsky)
How does that strike you as the current “rationality landscape”? Again, I’m largely new here as a community member, so I could be mischaracterizing or leaving ideas out.
The first glance, as usual, reveals interesting things about one’s perception:
Moldbuggery: Bias is bad, real fucking bad. The current systems don’t encourage rationality all that much either. Only a cognitive elite can ever become debased enough to run things
That’s honestly how I read it at first. Ha.
BTW Konkvistador belongs in better company (nothing against the others); I’ve come to admire him a little bit and think he’s much wiser than other fans of Moldbug.
Oh, and speaking of good company… “pure cynicism about how things are combined with an idealism of how to act”—that sounds like the ethics that Philip K. Dick tentatively proposes in his Exegesis; shit’s fucked, blind semi-conscious evil rules the world, but there’s a Super-Value to being kind and human even in the face of Armageddon.
I asked Konkvistador if he endorsed the Moldbuggery statement in IRC and he liked it. But I think I want to decontextualize the attitudes toward bias and debiasing so I can better fit different authors/posters together. :/
I’ve come up with fatalism / pessimism / elitism / rational schizophrenia / optimism. With that breakdown I can put Konkvistador in the same category with Plato. I love the name “rational schizophrenia” too much to give it up.
I’d endorse too (with appropriate caveats about what part of the alt-right I struggle to reject), but the meta-ethical point Karmakaiser is making doesn’t help decide what ethical positions to adopt—only what stance one should take towards one’s adopted moral positions.
“THE REAL PROBLEM OF OUR TIME,” George Orwell wrote in 1944, “is to restore the sense of absolute right and wrong when the belief that it used to rest on—that is, the belief in personal immortality—has been destroyed. This demands faith, which is a different thing from credulity.” It also demands conviction, which is a different thing from wanting to win at any price. The real problem of the left in our time is to restore those absolutes and to find that faith.
Of course, Orwell was not talking about religious faith. Nor am I. Ironically, one of the treasures bequeathed to us by the world’s ethical religions is the self-effacing hint that the basis of morality does not have to be religious. “Whatever you would have others do to you, do to them.” In other words, the most reliable sense of right and wrong comes from your own skin, your own belly, your own broken heart.
That said, religion can provide some useful insights, if only to debunk a few of the notions that are being foisted upon us in the name of religion. The Christian right preaches an extremely selective version of its own creed, long on Leviticus and short on Luke, with scant regard for the Prophets and no end of veneration for the profits. Its message goes largely unchallenged, partly through general ignorance of biblical tradition and partly because liberal believers and nonbelievers alike wish to maintain a respectable distance from the rhetoric of fundamentalism. This amounts to a regrettable abandonment of tactics. One of Saul Alinsky’s “Rules for Radicals” was “Make the enemy live up to their own book of rules”—a tough act to pull off if one doesn’t even know the rule book.
I see, so this is why you seem to often bring up such discussions on LessWrong? Because you see it as a repository of smart, under-socialized, independent thinkers? I do, to a certain extent, and in this light your most recent writing appears much more targeted, rather than an overblown obsession.
In brief, I fear alt-right/technocratic ideas not because they’re in any way popular or “viral” at present, but because I have a nasty gut feeling that they—in a limited sense—do reflect “reality” best of all, that by most naive pragmatist reasoning they follow from facts of life, and that more and more people for whom naive reasoning is more important than social conventions will start to adopt such thinking as soon as they’re alerted to its possibility.
Do you think this might already be happening? The naive, social-conventions-ignoring utilitarianism we often find ourselves disagreeing with seems to be remarkably widespread among baseline LessWrongers. One merely needs to point out the “techno-fascist” means and how well they might work, and I can easily see well over a third embracing them, and even more should criticism of “Cathedral” economic and political theory become better understood and more widespread.
But again, remember that the “alternative right” has plenty of anti-epistemology and mysticism springing from a fascination with old fascist and, to a lesser extent, New Left intellectuals; this will, I think, restrain them from fully coalescing around the essentially materialist ethos that you accurately detect is sometimes present.
And even if some of this does happen, either from the new-right people or from “rationalists” and the cognitive elite, tell me honestly: would such a regime and civilization have better or worse odds at creating FAI or surviving existential risk than our own?
And, crucially, in the age of the internet and such, there will be more and more such under-socialized, smart people growing up and thinking more independently
But recall what Vladimir_M pointed out: in order to gain economic or political power, one must in the age of the internet be more conformist than before, because any transgression is one Google search away. Doesn’t this suggest there will be some stability in the social order for the foreseeable future? Or that if change does happen, it will only occur if a new ideal is massively popular, so that “everyone” transgresses in its favour? Then punishment via hiring practices, reputation or law becomes ineffective.
Also: a third of LWers embracing technofascism? Is that a reference to a third of angels siding with Lucifer in Paradise Lost? Or was this unintended, a small example of our narrative patterns being very similar from Old Testament to Milton to now?
tell me honestly: would such a regime and civilization have better or worse odds at creating FAI or surviving existential risk than our own?
Surviving existential risk, probably. But, unlike today’s inefficient, corrupt, narrow-minded liberal oligarchy, such a regime would—precisely because of its strengths and the virtues of the people who’d rise to the top of it (like objectivity, dislike of a “narrative” approach to life, and a cynical understanding of society)—be able to make life hardly worth living for people like us. I don’t know whether the decrease in extinction risk is worth the vastly increased probability of a stable and thriving dystopia, where a small managerial caste is unrestrained and unchallenged. Again, democracy and other such modern institutions, pathetic and stupid as they might be from an absolute standpoint, at least prevent real momentous change.
And their “F”AI could well implement many things we’d find awful and dystopian too (e.g., again, a clean, ordered society where slavery is allowed and even children are legally chattel slaves of their parents, to be molded and used freely)—unlike something like this happening with our present-day CEV, it’d be a feature, not a bug. In short, it’s likely a babyeater invasion in essence.
I’m a moral anti-realist through and through, despite believing in god(s). I judge everyone and their lives from my own standpoint. Hell, a good citizen of the Third Reich might’ve found my own life pointless and unworthy of being. Good thing that he’s shot or burnt, then. There’s no neutral evaluation.
I judge everyone and their lives from my own standpoint… There’s no neutral evaluation.
You sound like a subjectivist moral realist.
Possibly even what we tend to call “subjectively objective” (I think we should borrow a turn of phrase from Epistemology and just call it subject-sensitive invariantism).
Specifically the great power structures opposing moves towards your ideal. It almost doesn’t matter which ideal, since those that I see would oppose most change
Keep in mind that while every improvement is a change, most potential changes are not improvements and for most ideals, attempting to implement them leads to total disaster.
Yep. Both he and I have stressed the first half of that several times in one form or other. However, it’s nonsense to say that trying to implement ideals is bad, period, because the problem here is that humans are very bad at some things that would be excellent in themselves—like a benevolent dictatorship. If, for example, we had some way to guarantee that one would stay benevolent, then clearly all other political systems should get the axe—to a utilitarian, there’s no more justification for their evils! But in reality, attempts at one usually end in tears.
However, trying to, say, keep one’s desk neat & organized is also an ideal, yet many people, particularly those with OCD, are quite good at implementing it. It is clear, then, that whatever we do, we should first look to psychological realities, and manipulate ourselves in such a way that they assist our stated goals or just don’t hinder them as much.
What about yourself? At the risk of engaging in internet cold reading [...]

Replied elsewhere.
The Alt Right is correct in its diagnosis of societal cancers, but their proposals look depressingly closer to leeching than to chemotherapy.

Is this an accurate restatement?

In all frankness, that’s how I bellyfeel it.
Is this a fair reading?

I struggled with something similar a while ago, and Vladimir_M had a different take.
I really like summarizing to make sure I get things right. Watch as I prove it!
When dealing with real world morality and goal seeking behavior we seem forced to stare in the face the following facts:
We are very biased.
We could be more rational
Our rationality isn’t particularly good at morality.
Complicating this are the following:
Heuristics generally work. How much rationality do you need to out compete moral and procedural heuristics?
Just how rational can we get. Can low IQ people become much more rational, or are we forced to believe in a cognitive and rationality based elite?
Should we trust moral reasoning or heuristics at all?
I’ve seen the following conclusions drawn so far by people who take bias seriously: (There may be more, this is what I’ve encountered. Also the first two are just jokes I couldn’t resist)
Lovecraftian: The world is ruled by evil Gods beyond imagination. I have seen too much! Go back into the warm milkbath of ignorance! Chew your cud you cows and never think of the slaughter to come!
Existientialism: Everything sucks forever but let’s not kill ourselves because it’s better to push a rock up a mountain or something. We can never know anything and nothing can ever mean anything so we should talk about it forever. Give me Tenure! Linkin Park and Tenure!
Moldbuggery: Bias is bad, real fucking bad. The current systems don’t encourage rationality all that much either. Only a cognitive elite can ever become debiased enough to run things and they should only be trusted if we get a system that aligns with the interests of the subjects. (Ex: Aurini, GLADOS, Konkvistador, Moldbug, Nick Land)
[I had a section on Robin Hanson, but I don’t think I understand it well enough to summarize on reflection, so “This Page Left Blank”]
Old Whiggish: We are very biased and ought to trust intuition, tradition and reason roughly in equal measure. We pride reason too much and so people who try to be perfectly rational are worse reasoners than those who allow a little superstition in their life. Our Heuristics are better than we think. If it works, we should keep it even if it isn’t true. (Ex: Taleb, Derbyshire, Burke, Marcus Aurelius. Almost Jonathan Haidt post “The Righteous Mind” but not quite)
Rational Schizophrenia: A pure cynicism about how things are should be combined with an idealism of how to act. [See above for Multithreaded’s advice]
Yudkowskyianism: Bias is very bad but our prospects for debiasing is less pessimistic than either of those make it out to be. Rationality is like marital arts, any can learn to use leverage regardless of cognitive strength. Though there are clear ways in which we fail, now that we have Bayesian Probability theory derived from pure logic we know how to think about these issues. To abuse a CS Lewis quote: “The Way has not been tried and found wanting; it has been found difficult and left untried.” Try it before giving up because something is only undoable until somebody does it. (Ex: Lukeprog, Yudkowsky)
How does that strike you as the current “Rationality landscape”? Again, I’m largely new here as a community member, so I could be mischaracterizing or leaving ideas out.
The first glance, as usual, reveals interesting things about one’s perception:
That’s honestly how I read it at first. Ha.
BTW Konkvistador belongs in better company (nothing against the others); I’ve come to admire him a little bit and think he’s much wiser than other fans of Moldbug.
Oh, and speaking of good company… “pure cynicism about how things are combined with an idealism of how to act”—that sounds like the ethics that Philip K. Dick tentatively proposes in his Exegesis: shit’s fucked, blind semi-conscious evil rules the world, but there’s a Super-Value to being kind and human even in the face of Armageddon.
I asked Konkvistador in IRC if he endorsed the Moldbuggery statement, and he liked it. But I think I want to decontextualize the attitudes toward bias and debiasing, so I can better fit different authors/posters together. :/
I’ve come up with fatalism / pessimism / elitism / rational schizophrenia / optimism. With that breakdown I can put Konkvistador in the same category as Plato. I love the name “rational schizophrenia” too much to give it up.
I liked it too, thanks! :)
Huh… yeah! I’d sign under that. And, when you phrase it so nicely, I’m sure that a few others here would.
I’d endorse too (with appropriate caveats about what part of the alt-right I struggle to reject), but the meta-ethical point Karmakaiser is making doesn’t help decide what ethical positions to adopt—only what stance one should take towards one’s adopted moral positions.
Also, there’s an interesting writer with agreeable sentiments coming up on my radar after 30 seconds of googling. His name’s Garret Keizer.
http://www.motherjones.com/politics/2005/03/left-right-wrong
Shit, I’d better start reading this guy!
I see, so is this why you seem to often bring up such discussion on LessWrong? Because you see it as a repository of smart, under-socialized, independent thinkers? I do, to a certain extent, and in this light your most recent writing appears much more targeted rather than an overblown obsession.
Do you think this might already be happening? The naive, social-convention-ignoring utilitarianism we often find ourselves disagreeing with seems remarkably widespread among baseline LessWrongers. One merely needs to point out the “techno-fascist” means and how well they might work, and I can easily see well over a third embracing them, and even more should criticism of “Cathedral” economic and political theory become better understood and more widespread.
But again, remember that the “alternative right” has plenty of anti-epistemology and mysticism springing from a fascination with old fascist and, to a lesser extent, New Left intellectuals; this will, I think, restrain them from fully coalescing around the essentially materialist ethos that you accurately detect is sometimes present.
And even if some of this does happen either from the new right people or from “rationalists” and the cognitive elite, tell me honestly would such a regime and civilization have better or worse odds at creating FAI or surviving existential risk than our own?
But recall what Vladimir_M pointed out: in the age of the internet, in order to gain economic or political power one must be more conformist than before, because any transgression is one Google search away. Doesn’t this suggest there will be some stability in the social order for the foreseeable future? Or that if change does happen, it will only occur if a new ideal is massively popular, so that “everyone” transgresses in its favour, making punishment via hiring practices, reputation or law ineffective.
Also: a third of LWers embracing technofascism? Is that a reference to a third of angels siding with Lucifer in Paradise Lost? Or was this unintended, a small example of our narrative patterns being very similar from Old Testament to Milton to now?
I’m glad you caught the reference. :)
Surviving existential risk, probably. But, unlike today’s inefficient, corrupt, narrow-minded liberal oligarchy, such a regime would be able (precisely because of its strengths and the virtues of the people who’d rise to the top of it, like objectivity, dislike of a “narrative” approach to life, and a cynical understanding of society) to make life hardly worth living for people like us. I don’t know whether the decrease in extinction risk is worth the vastly increased probability of a stable and thriving dystopia, where a small managerial caste is unrestrained and unchallenged. Again, democracy and other such modern institutions, pathetic and stupid as they might be from an absolute standpoint, at least prevent real momentous change.
And their “F”AI could well implement many things we’d find awful and dystopian, too (e.g., again, a clean, ordered society where slavery is allowed and even children are legally chattel slaves of their parents, to be molded and used freely). Unlike such an outcome arising from our present-day CEV, it’d be a feature, not a bug. In short, it’s likely a babyeater invasion in essence.
(more coming)
I want to hear more about the Moldbuggian dystopia. Should make excellent SF.
I’m writing it! In Russian, though.
I think your idea that for people’s lives to be worth living they need to have certain beliefs is one of your ugliest recurring themes.
I’m a moral anti-realist through and through, despite believing in god(s). I judge everyone and their lives from my own standpoint. Hell, a good citizen of the Third Reich might’ve found my own life pointless and unworthy of being. Good thing that he’s shot or burnt, then. There’s no neutral evaluation.
You sound like a subjectivist moral realist.
Possibly even what we tend to call “subjectively objective” (I think we should borrow a turn of phrase from Epistemology and just call it subject-sensitive invariantism).
You don’t sound like a moral anti-realist at all.
Keep in mind that while every improvement is a change, most potential changes are not improvements and for most ideals, attempting to implement them leads to total disaster.
Yep. Both he and I have stressed the first half of that several times in one form or another. However, it’s nonsense to say that trying to implement ideals is bad, period, because the problem here is that humans are very bad at some things that would be excellent in themselves, like a benevolent dictatorship. If, for example, we had some way to guarantee that one would stay benevolent, then clearly all other political systems should get the axe; to a utilitarian, there’s no more justification for their evils! But in reality, attempts at one usually end in tears.
However, trying to, say, keep one’s desk neat & organized is also an ideal, yet many people, particularly those with OCD, are quite good at implementing it. It is clear, then, that whatever we do, we should first look to psychological realities, and manipulate ourselves in such a way that they assist our stated goals or just don’t hinder them as much.