That’s the kind of probability I would’ve assigned to EURISKO destroying the world back when Lenat was the first person ever to try to build anything self-improving. For a random guy on the Internet it’s off by… maybe five orders of magnitude? I would expect a pretty tiny fraction of all worlds to have the names of homebrew projects carved on their tombstones, and there are many random people on the Internet claiming to have AGI.
People like this are significant, not because of their chances of creating AGI, but because of what their inability to stop or take any serious precautions, despite their belief that they are about to create AGI, tells us about human nature.
Understanding “random guy on the Internet” to mean something like an Internet user about whom all I know is that they are interested in building AGI and willing to put some concerted effort into the project… hrm… yeah, I’ll accept 1e-7 as within my range.
My estimate for an actual random person on the Internet building AGI in, say, the next decade, has a ceiling of 1e-10 or so, but I don’t have a clue what its lower bound is.
That said, I’m not sure how well the willingness of a “random guy on the Internet” (in the first sense above) to try to build AGI without taking precautions correlates with the willingness of someone whose chances are orders of magnitude higher.
Then again, we have more compelling lines of evidence leading us to expect humans not to take precautions.
(I had to read that three times before getting why that number was 1000 times smaller than the other one, because I kept on misinterpreting “random person”. Try “randomly-chosen person”.)
I have no idea what you understood “random person” to mean, if not a randomly chosen person. I’m also curious now whether whatever-that-is is what EY meant in the first place.
A stranger, esp. one behaving in weird ways; this appears to me to be the most common meaning of the word in 21st-century English when applied to a person. (Older speakers might be unfamiliar with it, but the median LWer is 25 years old, as of the latest survey.) I had also taken the indefinite article to be an existential quantifier; hence I effectively interpreted the statement as “at least one actual strange person on the Internet builds AGI in the next decade”, for which I thought such a low probability would be ridiculous.
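To make the gap between those two readings concrete, here is a minimal sketch in Python. The 1e-10 per-person ceiling is the figure from upthread; the user count (about two billion, roughly the Internet population at the time) and the independence assumption are mine, purely for illustration.

```python
# Two readings of "P(a random person on the Internet builds AGI this decade)".
# The per-person ceiling (1e-10) is from the thread; the user count and the
# independence assumption are illustrative guesses, not figures from the thread.

q = 1e-10          # ceiling on P(one specific, randomly chosen user builds AGI)
n = 2_000_000_000  # assumed number of Internet users

# Reading 1 (randomly chosen person): just the per-person probability.
p_randomly_chosen = q

# Reading 2 (existential, "at least one such person"):
# P(at least one) = 1 - P(nobody does) = 1 - (1 - q)**n,
# which for small q is approximately n * q.
p_at_least_one = 1 - (1 - q) ** n

print(p_randomly_chosen)  # 1e-10
print(p_at_least_one)     # ~0.18, about nine orders of magnitude larger
```

Under the existential reading a ceiling of 1e-10 would indeed be ridiculous; under the randomly-chosen reading it is merely very small.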
Thanks for clarifying.
Are these in any way a representative sample of normal humans? To be in this category one generally needs to be pretty high on the crank scale, with some healthy Dunning-Kruger issues to match.
That’s always been the argument, that future AGI scientists won’t be as crazy as the lunatics presently doing it, since the current crowd of researchers is self-selected for incaution. I wouldn’t put too much weight on that: it seems like a very human behavior; some of the smarter ones with millions of dollars don’t seem to be of below-average competence in any other way; and the VCs funding them are similarly incapable of backing off even when they say they expect human-level AGI to be created.
Sorry, I’m confused. By “people like this” did you mean people like FinalState or did you mean professional AI researchers? I interpreted it as the first.
AGI researchers sound a lot like FinalState when they think they’ll have AGI cracked in two years.
Eliezer < anyone with actual notable accomplishments. Edit: damn it, you edited your message.
Over 140 posts and 0 total karma; that’s persistence.
private_messaging says he’s Dmytry, who has positive karma. It’s possible that the more anonymous-sounding name encourages worse behaviour though.
Before people downvote PM’s comment above, note that Eliezer’s comment, prior to editing, contained a hierarchy of AI researchers: the lowest tier being people like FinalState, the middle tier professional AI researchers, and the highest “top AI researchers”.
With that out of the way: what do you think you are accomplishing with this remark? You have a variety of valid points to make, but I fail to see how this remark advances any of them.
Me or Eliezer? I’m making some point by direct demonstration. It’s a popular ranking system, ya know? He used it on FinalState. A lot of people use it on him.
There’s got to be a level beyond “arguments as soldiers” to describe your current approach to ineffective contrarianism.
I volunteer “arguments as cannon fodder.”