Classified information about supposedly leaked classified information doesn’t seem very credible. If you can’t spill the beans on your sources, why say anything? It just seems like baseless mud-slinging against a perceived competitor.
Note that this has, historically, been a bit of a problem with MIRI. Lots of teams race to create superintelligence. MIRI’s strategy seems to include liberal baseless insinuations that their competitors are going to destroy the world. Consider the “If Novamente should ever cross the finish line, we all die” case. Do you folk really want to get a reputation for mudslinging—and slagging off competitors? Do you think that looks “friendly”?
In I.T., focusing on your competitors’ flaws is known as F.U.D. (fear, uncertainty, and doubt). I would counsel taking care when using F.U.D. tactics in public.
Classified information about supposedly leaked classified information doesn’t seem very credible.
It’s not that classified, if you know people from Google who engage with Google’s TGIFs.
MIRI’s strategy seems to include liberal baseless insinuations that their competitors are going to destroy the world. Consider the “If Novamente should ever cross the finish line, we all die” case.
There’s nothing special about Goertzel here, and I don’t think you can pretend you don’t know that. We’re just saying that AGI is an incredibly powerful weapon, and FAI is incredibly difficult. As for “baseless”, well… we’ve spent hundreds of pages arguing this view, and an even better 400-page summary of the arguments is forthcoming in Bostrom’s Superintelligence book.
It’s not mudslinging, it’s Leo Szilard pointing out that nuclear chain reactions have huge destructive potential even if they could also be useful for power plants.
We’re just saying that AGI is an incredibly powerful weapon, and FAI is incredibly difficult. As for “baseless”, well… we’ve spent hundreds of pages arguing this view, and an even better 400-page summary of the arguments is forthcoming in Bostrom’s Superintelligence book.
It’s not mudslinging, it’s Leo Szilard pointing out that nuclear chain reactions have huge destructive potential even if they could also be useful for power plants.
Machine intelligence is important. Who gets to build it using what methodology is also likely to have a significant effect. Similarly, operating systems were important. Their development produced large power concentrations—and a big mountain of F.U.D. from predatory organizations. The outcome set much of the IT industry back many years. I’m not suggesting that the stakes are small.