No, I haven’t read it yet, but it’s on my list. Here’s another download link: http://dl.dropbox.com/u/33627365/Scholarship/Spent%20Sex%20Evolution%20and%20Consumer%20Behavior.pdf
Grognor
http://lesswrong.com/lw/82g/on_the_openness_personality_trait_rationality/ has a download of one book very close to this topicspace.
I wasn’t one of the silent downvoters, but I went ahead and downvoted without being silent because your comment just misunderstands Larks’s. He did not even implicitly claim that there is a creationism tradition in biology, but rather an ongoing, publicized debate between evolution and creationism, which is analogous to the analytic vs. continental divide in philosophy, even if one side is laughably wrong but still famous for whatever reason.
they can’t convert energy (for the care and feeding of humans) into work more efficiently by building robots than through human workers.
The initial idea was that humans are essentially self-sustaining, and the AI would take over the environment that contains humans just as humans took over the environment that did not contain humans.
I agree completely but downvoted for ranting with dubious pertinence as an ideologue.
(I upvoted this post because I saw that its score was at −1. I normally avoid doing this, but here I make an exception because, while this is not a quality article, I think this sort of thing should be encouraged here on Less Wrong, the sort of “I have a doubt and I want to know what to do about it” posts that this is an example of.
Saying,
This back-and-forth from certainty to uncertainty makes me feel like I’m doing something seriously wrong.
took at least some courage. Let us not punish that. On the other hand I do feel like this is more appropriate for the open threads, but nobody checks them the week before a new one anyway.)
Rather than needing us, it might be that self-improvement is costly, and since humans can understand complex commands and can be infected with memes rather easily, an AI may just start a religion or some such to turn us all into its willing slaves.
Edit: Why is this being downvoted? I am not saying this would be a good outcome! People who want to “coexist” with superintelligences ought to learn that this could be and probably would be worse than outright destruction.
Edit: Well said, Carl. I thought of it in response to another comment and couldn’t get it out of my head.
Sure, LW could be better, but what are you comparing to? Every time I try to have a conversation outside LW/OB I am slapped in the face by how much worse other communities tend to be.
Yes, Less Wrong is better than all other places. But I hope you will agree that this is not an optimistic prognostication. I do not think we are doing particularly well, if you judge us on our own merits rather than by comparing this place to other places.
I’d like to remind you of some of the words from my favorite essay, which is also one of your favorite essays:
But it is useless to be superior: Life is not graded on a curve. The best physicist in ancient Greece could not calculate the path of a falling apple. There is no guarantee that adequacy is possible given your hardest effort; therefore spare no thought for whether others are doing worse.
I do not think we are doing the best we possibly can, and I think that is very bad.
Yvain, myself, Anna Salamon, and many others have written hundreds of useful and well-liked posts since The Sequences. In what sense is it “Eliezer’s blog”?
I agree, but these are salient exceptions, not the rule. It is “Eliezer’s blog” in the sense that The Sequences are the most important thing here, but people are barely reading them (or so I hear).
It’s also untrue that Eliezer no longer writes updates.
They are so very, very rare, though. And the others you listed, along with many of the others who once made good contributions, have all but stopped.
If he existed he would make your life utterly miserable.
Problem solved.
Edit: Even though this is the tersest possible reply I could have given, it is not meant as a dismissal; I really do think this turns the problem into a simple calculation. If creating Bob would make your life bad enough that it is more horrible than his counterfactually not existing, you are already done.
You could go with what Everett wanted to call it in the first place, the relative state interpretation.
To answer your “Edit” question, no, the relative state interpretation does not include probabilities as fundamental.
“Should I buy now or wait for the new models?” is a refrain so often heard from the panicky first-timer, who forgets that the number of sledgehammer innovations in the last three thousand years can be counted on one finger.
The presupposition is that passing judgment on somebody’s “lifestyle” (for those who do not speak psychobabble, this means the English word behaviors) is an activity which is forbidden. It follows immediately that when the person says to you “Don’t be all judgmental” they are in fact passing judgment on your behavior. In other words, they are “being all judgmental.” It is, therefore, impossible not to pass judgment. I do not mean “impossible” in the colloquial sense of “unlikely”, but in the logical sense of “certainly cannot be no matter what.”
No, it just happened. You’re underestimating the degree to which people can have different aliefs.
Spot the fallacy in:
We should not hit ourselves on the head with hammers, because that would lead to us being in pain.
It’s appeal to consequences, after all. Ooh, or better yet, spot the fallacy in:
Argument from consequences leads to being wrong, and therefore you should not do it.
Interesting reaction. I shall admit that even though Eliezer’s free will sequence was intellectually convincing to me, it did not change my alief that free will just isn’t there and isn’t even a useful illusion. So this is going on my reading list.
I read it more charitably, as being isomorphic to Schopenhauer’s “A man can do as he wills, but not will as he wills.” The idea is that you are feeling something and not something else, and regardless of what you are feeling you can and should do right.