That’s why I asked whether Less Wrongers would prefer SI to devote more of its time to slowing down other people’s unfriendly AI relative to how much time it spends constructing FAI. I agree, SI staff shouldn’t answer.
I think any sequence of events in which anyone associated with either Less Wrong or SI does anything to hinder other people’s research would be a catastrophe for this community. At best, you will get a crank label (more than now, that is); at worst, the FBI will get involved.
I think you may be a bit late.
Yes. It’s much better to tile the universe with paperclips than to have this community looked on poorly. How ever could he have gotten his priorities so crossed?
If there is a big enough AI project out there, especially one that will be released as freeware, others won’t start competing projects: doing so would be high-risk and offer a low return on investment.
Three ideas to prevent unfriendly AGI (Scroll to “Help good guys beat the arms race”)
Also, my other two risky-AGI-deterring ideas could be pursued simultaneously. I’m not sure how many people it would take to get them moving on a large enough scale, but it’s probably nowhere near as many as making a friendly AGI would require.