That was an alternate universe. As this post was heavily downvoted, hardly any LWers took the challenge, depriving SIAI of the money they’d have gotten by referring successful challengers. Also, because of information cascades, the same thing happened to all future Quixey posts, leading Quixey to eventually stop posting here. Because of the negative word-of-mouth from such incidents, people stopped looking at the LW audience as a set of eyeballs to monetise.
Consequently, the SIAI was deprived of all potential advertising income, and lacked the budget to perfect FAI theory in time. Meanwhile, the Chinese government, after decades of effort, managed to develop a uFAI. Vishwabandhu Gupta of India managed to convince his countrymen that an AI is some sort of intelligence-enhancing ayurvedic wonder-drug that the Chinese had illegally patented. Consequently, the Indians eagerly invaded China, believing that increased intelligence would allow their kids to get into good colleges. This localised conflict blew up into the Artilect War, which killed everyone on the planet.
So please… don’t do that again. Just don’t. I’m tired of having to travel to an alternate universe every time that happens.
By not wanting advertising on LW, I have doomed humanity? Your sense of perspective is troubling. (You should also be ashamed of the narrative fallacy that follows.)
If the LW community’s votes are being overridden somehow, I would at least like the LW editors to be honest about it.
Because, clearly, it is impossible for something as huge as millions of lives to depend on an art academy’s decision.
Imagine, with every rejection letter the dean of admissions sends out, he has a brief moment of worry: “is this letter going to put someone on the path to becoming a mass murderer?” His sense of perspective would also be troubling, as his ability to predict the effect acceptance or rejection will have on his students’ lives is insufficient to fruitfully worry about those sorts of events. It’s not a statement of impossibility, it’s a statement of improbability. Giving undue weight to the example of Hitler is availability bias.
O RLY?
Yes, really. I presume you’ve read about fictional evidence and the conjunction fallacy? If you want to argue that LW’s eyeballs should be monetized, argue that directly! We’ll have an interesting discussion out in the open. But assuming that LW’s eyeballs should be monetized because you can construct a story in which a few dollars makes the difference between the SIAI succeeding and failing is not rational discourse. Put probabilities on things, talk about values, and we’ll do some calculations.
But assuming that LW’s eyeballs should be monetized because you can construct a story in which a few dollars makes the difference between the SIAI succeeding and failing is not rational discourse.
I’d have thought that the story being as far-fetched and ludicrous as it is would’ve made it obvious that I was just fooling around, not making an argument. Apparently that’s not actually the case.
My apologies if I accidentally managed to convince someone of the necessity of monetizing LW’s eyeballs.
I completely misunderstood your post, then. My apologies as well.