I run the EA aligned YouTube channel A Happier World: www.youtube.com/ahappierworldyt
My name is pronounced somewhat like ‘yuh-roon’.
Pronouns: he/him
While this might be a great way to earn money (assuming competitors won’t invest similarly in AI soon enough), aren’t there good reasons not to invest in AI capabilities, such as reducing P(doom)?
Also, I assume it’s wise to mention that you’re not a financial adviser and don’t bear responsibility for actions people take because of your comment (the same goes for me).
Hey Bruno! I’m an organiser for EA Brussels and would love to collaborate on this (e.g. by making a Facebook event on the EA Brussels page/group). Would love it if you could reach out to me :)
https://www.facebook.com/jeroen.willems.7528/
or jeroen at eabrussels dot org
Thanks for explaining your thoughts on AI safety; it’s much appreciated.
I think in general when trying to do good in the world, we should strive for actions that have a high expected value and a low potential downside risk.
I can imagine a high expected value case for Anthropic. But I don’t see how Anthropic has few potential downsides. I’m very worried that by participating in the race to AGI, Anthropic might increase P(doom).
For example, as habryka pointed out in the comments here:
Could you explain to me why you think there are no large potential downsides to Anthropic? I’m extremely worried the EA/LessWrong community has so far only increased AI risk, and the creation of Anthropic doesn’t exactly soothe these worries.
PS: You recently updated your website, and it finally has a lot more information about your company as well as a contact email listed, which is great! But I just wanted to point out that when emailing hello [at] anthropic.com, I get an email back saying the address wasn’t found. I’ve tried contacting your company about my worries before, but it seems really difficult to reach you.