For example, my opinion about IQ and ethnicity is that the obvious group differences seem, naively, to suggest some kind of genetic difference, but I know psychologists have complicated statistical arguments for why that may not be the case, so I don’t know.
Yeah this seems fair.
Is this different for you? How do you end up in discussions about it with people who will then be offended when you share your opinion?
In the past I have actively brought up political topics to discuss with my close circle of trust. If you discuss enough political topics, you can easily end up hitting this particular one (IQ and group differences). I have distanced myself from people over similar topics, though not this exact one. I can imagine the stakes being much higher once I am in a position of influence (which I aspire to be in).
Multiple such experiences are part of what made me realise there are pros and cons to having even an innocent discussion with your closest friend.
I am curious how you navigate this in discussions with people close to you.
unless your life depends on the approval of others in a somewhat atypical way. (Perhaps it does, if your life involves being famous.)
Most big ways of influencing the world route through acquiring the approval of others. Maybe not 100 million people, but at least 1000 people.
I know little about biosecurity and I don’t intend my remarks to extend to “infohazard” kinds of information.
Yes, my point in bringing up that example was about infohazardous information. And not just, say, knowing the DNA sequences of unreleased pathogens worse than covid, but lots of lower-stakes information: knowledge of protocols and operating equipment, how to procure cultures and equipment anonymously, who knows what inside the biosecurity world, etc. Even one sufficiently agentic and trusted PhD going rogue could cause meaningful damage IMO.
The resulting secrecy-focussed culture has implications for the personality traits of the senior people in the space, how big their circles of trust are, how they look at and treat other people, and so on. (I don’t know as much as I’d like about what the implications are, but I know they’re non-trivial.)
Also, not everyone who is quite open in the biosecurity world should necessarily be as open as they are; that is a whole other discussion. It’s not obvious to me that anyone has figured this stuff out sufficiently to conclusively say that for biosecurity, openness policy X is Good and Y is Bad, end of discussion. Which is why I want to discuss it.
that some EA grantmakers like SFF consider, and I don’t see why being the kind of person who speaks their mind about controversial beliefs would make them less likely to fund you.
I think this is generally true, UNTIL you hit one of the big red flags they have secretly written down in their Google doc.
(Or them just not liking you as a person; EA leadership sometimes claims to be high-trust, which means its implicit reliance on “vibes” and friends-of-friends as a shortcut for trust is non-zero. But this is a less important point, so I won’t argue it much.)
More importantly though, doing important stuff in the world requires gaining the approval of more people than just SFF grantmakers.
I guess this comes back to my earlier point: do you want to blindly execute the implications of whatever culture SFF grantmakers want to propagate (which is ultimately traceable back to Yudkowsky in the year 2000), or do you want to create your own culture? Culture is billion-dimensional.
This is kinda vague and I haven’t explained it very well, but it is something I think about a lot.