Oh, sorry about that! After this dawned on me, I just kind of skimmed the rest and the subtitle “The China question” did not trigger a blip on my “you must read this before posting that idea” radar.
What did you think of my ideas for slowing Moore’s law?
Patents are a completely unworkable idea.
Convincing programmers might work, if we think very few programmers or AI researchers are the ones making actual progress. Herding programmers is like herding cats, so this works only in proportion to how many key coders there are—if you need to convince more than, say, 100,000, I don’t think it would work.
The PR-nightmare idea seems to run into the same problem.
Winning the race is a reasonable idea, but I’m not sure the dynamic actually works that way: someone wanting to produce and sell an AI, period, might be discouraged by an open-source AI, but a FLOSS AI would just be catnip to anyone who wants to throw it on a supercomputer and make $$$.
I wish this were on the idea comment rather than over here… Sorry, but I’m going to relocate my response to the other thread where my comment is: discussing it here would just pull a bunch of people into a conversation on the wrong thread. So, for the sake of keeping things organized, my response about the feasibility of convincing programmers to refuse risky AI jobs is on the other thread.