The problem is basically that our machinery for emergencies is entirely broken. More specifically, our emergencies now play out over years or decades of a 10x-100x industrial revolution, while our fight-or-flight reactions are at best only useful for days-scale emergencies like stress.
That’s not really a problem. It’s a parameter. What this means is that you can’t functionally use the fight-or-flight system to orient to long-term emergencies, except on the (rare) occasions when you in fact see some pivotal thing on the timescale of seconds or minutes that actually makes a big difference.
…which, as a corollary, means that if you’re freaking out about something on timescales longer than that, you haven’t really integrated an understanding of the whole context deeply enough, and/or you’re projecting an immediate threat onto a situation that probably isn’t actually threatening (i.e., trauma triggers).
I was talking about how we naturally deal with an emergency, and why our evolved ways of dealing with one are so far off-distribution here that they actively harm us. The basic problem is that in the EEA, the longest emergencies lasted days; by then you were either dead or more or less completely recovered, and often emergencies resolved in minutes or seconds.
Now we have to deal with an abstract emergency that can only be handled by experimenting, because our abstract, first-principles reasoning evolved for tribalism rather than truth. And given that the likely takeoff is slow, AI X-risk means dealing with a years-long emergency continuously.
This is entirely off-distribution for our evolved machinery for emergencies, which is why I made that comment.
Yep, I think I understood. I thought your comment made sense and was worth saying.
I think my phrasing came across funny over text. Reading it back, it sounds more dismissive than I meant it to be.
I do suspect we disagree about a subtle point. I don’t think our evolved toolkit works against us here. The problem (as I see it) is upstream of that. We’re trying to treat things that last for more than days as “emergencies”, thus inappropriately applying the evolved emergency toolkit in situations it doesn’t work well for.
I mean, if I want to visit a friend who’s half a mile away, I might just walk. If I want to visit a friend who’s across the country, I’m not even tempted to think the answer is to start walking in their direction. This is a case where understanding the full context means that my evolved instincts (in this case for traveling everywhere via walking) help instead of creating a problem: I walk to my computer to buy a plane ticket, then to my car to drive to the airport, etc.
We haven’t worked out the same cultural machinery for long-timescale emergencies yet.
And you’re quite right: given this situation, our instincts are terribly set up for handling it.
But that’s what psychotechnologies are for. Things like language and mathematics help to extend our instinctual basis so we can work with things way, way beyond the scope of what our evolved capacities ever had to handle.
We just haven’t developed adequate psychotech here just yet.
(And in this particular case I think current unaligned superintelligences want us to stay scared and confused, so inventing that psychotech is currently an adversarial process — but that’s not a crux for my point here.)
Yeah, that’s a crux for me. Essentially, evolution suffered an extremal Goodhart problem: taking the naturally evolved mechanisms for emergencies out of their EEA distribution leads to weird and bad outcomes.
My point is that genetics and evolution matter a lot, much more than a lot of self-help and blank-slate views tend to allow, which is why I give primacy to genetic issues. So psychotechnologies and new moral systems are facing off against a very powerful optimizer, genes and evolution, and usually the latter wins.
The basic problem is that in the EEA, the longest emergencies lasted days;
Is that actually the case? Something like a famine seems like it could last longer and be easily caused by unfavorable seasonal conditions. More social kinds of emergencies, like “a competing faction within your tribe starting to gain power”, also seem like they would be longer-lasting. The same goes for, e.g., a significant number of your tribesmen happening to get sick or badly wounded around the same time.
I’ll somewhat concede here that such things can be dealt with. On the tribal example, well, our big brains are basically geared to tribal politics for this exact reason.