It seems “taboo” to me. Like, when I go to think about this, I feel … inhibited in some not-very-verbal, not-very-explicit way. Kinda like how I feel if I imagine asking an inane question of a stranger without a socially sensible excuse, or when a clerk asked me why I was buying so many canned goods very early in Covid.
I think we are partly seeing the echoes of a social flinch here, somehow. It bears examining!
Open tolerance of the people involved with the status quo, and fear of alienating or making enemies of powerful groups, is a core part of current EA culture! Steve’s top comment on this post is an example of enforcing/reiterating this norm.
It’s an unwritten rule that seems very strongly enforced yet never explicitly acknowledged, much less discussed. People were shadow-blacklisted by CEA from the Covid documentary it funded for being too disrespectful in their speech about how governments have handled Covid. That fits what I’d consider a taboo: something any socially savvy person would pick up on and internalize if they were around it.
Maybe this norm of open tolerance is downstream of the implications of truly considering some people to be your adversaries (which you might do if you thought delaying AI development by even an hour was a considerable moral victory, as the OP seems to). Doing so does expose you to danger. I would point out that lc’s post analogizes their relationship with AI researchers to Israel’s relationship with Iran; when I think of Israel’s resistance to Iran, nonviolence is not the first thing that comes to mind.
I agree. I also think this is a topic that needs to be seriously considered and discussed, because not doing so may leave behind a hidden hindrance to accurate collective assessment and planning for AI risks. Contrary to our conceits and aspirations, our judgments aren’t at all immune to the sway of biases, flawed assumptions, and human emotions. I’m not sure how to put this, but people on this forum don’t come off as very worldly, if that makes sense. A lot of people are in technical professions where understanding of political realities seems to be lacking. The US and China stand to be the two major drivers of AI development in the coming decades. Increasingly they don’t see eye to eye, and an arms-race dynamic might develop. So I feel there’s been a lot of focus on the technical/theoretical side of things, but not enough concern over the practical side of development, the geopolitical implications, and all that might entail.
FYI, I thought this sort of idea was an obvious one, and I’ve been continuously surprised that it didn’t have more discussion. I don’t feel inhibited and am sort of surprised you are.
(I do think there’s a lot of ways to do this badly, with costs on the overall coordination-commons, so, maybe I feel somewhat inhibited from actually going off to do the thing. But I don’t feel inhibited from brainstorming potential ways to address the costs and thinking about how to do it)
OK! Well, I can’t speak for everyone’s experiences, only my own. I don’t think this subject should be taboo and I’m glad people are talking more about it now.
(kinda intrigued by the notion of there being dark-matter taboos)
I feel similarly.
I also find it somewhat taboo but not so much that I haven’t wondered about it.