Secrecy is the exception. Mostly no one cares about your startup idea or will remember your hazardous brainstorm, no one is going to cause you trouble, and so on, and honesty is almost always the best policy.
That doesn’t mean always tell everyone everything, but you need to know what you are worried about if you are letting this block you.
On infohazards, I think people were far too worried for far too long. The actual dangerous idea turned out to be that AGI was a dangerous idea, not any specific thing. There are exceptions, but you need a very good reason, and an even better reason if it is an individual you are talking with.
Trust in terms of ‘they won’t steal from me’ or ‘they will do what they promise’ is another question with no easy answers.
If you are planning something radical enough to actually get people’s attention (e.g. breaking laws, using violence, fraud of various kinds, etc) then you would want to be a lot more careful who you tell, but also—don’t do that?
I agree with this if you do not have any significant power or influence. As of now, I'm mostly worried about making wrong choices that will really hurt me (and the world) years from now, in the best-case scenario where I do end up having a lot of impact.
For instance, Nick Bostrom's email, which leaked years later. Or the many moral and epistemic assumptions made by the original EA and Bay Area crowd that still cause people to sometimes make the world a more uncertain place. (I'm not saying uncertainty is bad, but in my view it is very often the case that well-intentioned people inside EA cause harm.)
I agree popularizing the idea of AGI itself was dangerous. It's possible that if Yudkowsky had kept quiet on the Extropians mailing list and disappeared into the woods instead, DeepMind and OpenAI would not exist today.
My worry is that this also applies to literally every other technology on the Extropians mailing list or anything similar, be it sulfur geoengineering, nanotech, gene drives, engineered microbiomes, nonlethal incapacitating agents, or at least ten other things. I see designing good culture and institutions for all of this as a massively unsolved problem.
Would you consider it breaking the law to train GPT-4 on copyrighted text? (Or any of the N GPT-4 startups also crawling the web right now?) What about Satoshi launching cryptocurrency? What about GPT-4 email spam? What about writing a doxxing tool? What about starting a dating app to collect user databases and curry favour with authoritarian governments? What about a better translation tool that permanently erases all language divides on Earth and alters geopolitics as a result? What about working on improving lie detection? What about distributing libertarian ideologies, ham radios, and gunpowder in countries that currently do not allow them?
What about starting a gene drive startup that massively hypes the upside and steamrolls the safety people, the way Sama did for AGI? Like, it is obvious to me that if I wanted to become one of the ten most powerful people in history, this is the only move, and the only real excuse I have for not doing it is that I am not willing to work that insanely hard. (Plus vague “vibes” that if I don't maintain complete control, and someone else gets control or there is a geopolitical arms race to stockpile gene drives, the world could end up worse for me than if I had never started the startup at all.)
These are just a bunch of live threads in my mind right now. Bullet645 having researched any one of these ideas for two years looks more dangerous than bullet645 just having mentioned them in passing, like I did just now.
If you've read my comments on this post and still think it's basically fine to discuss whatever I want, with a low filter, within a close circle, I would like to know your reasoning. I'm happy to discuss over email as well.
As an aside, I would love your input on how I can get more people to engage with my post. I've cold-emailed it to over a hundred people in EA/rationality, but I suspect my framing is not good if most people ignored it.
(I usually don't cold-email unless something is important; most of these people I have not written to in at least a year.)
I think it’s pretty defecty to email a lot of people without saying in the email that you’ve done so.
Oh—is this a norm? I've received many mass emails, but never a mass email telling me it is a mass email, or who else has received it.
I’ll try to keep this in mind.
I think it’s a norm if you’re bidding for specific attention, yeah. Like, you should either do more work to figure out a smaller set of people who you want to bid for attention from, or else at least say that you haven’t done that work.
Thanks. Will keep in mind next time.
Honestly, I don't know how to narrow it down, since I'm mostly asking for help from strangers in EA/rat circles and I don't know in advance who actually has the time/energy/interest for this right now. Maybe I could make some guesses, but that would have made my mailing list much smaller without much gain.
I have spoken to people in EA/rationality who are close to me but this hasn’t fixed my problems yet. It’s easier to be more blunt when anonymous and talking to strangers.