Can somebody explain to me why people generally assume that the great filter has a single cause? My gut says it’s most likely a dozen one-in-a-million chances that all have to turn out just right for intelligent life to colonize the universe. So the total chance would be (1/1,000,000)^12 = 10^-72. Yet everyone talks of a single ‘great filter’ and I don’t get why.
I don’t think anyone really assumes that.
Maybe not explicitly, but I keep seeing people refer to “the great filter” as if it was a single thing. But maybe you’re right and I’m reading too much into this.
Normal filters have multiple layers too—for example first you can have the screen that keeps the large particles out. Then you have the activated charcoal, and then the paper to keep the charcoal out, and for high-end filters you finish off with a porous membrane.
And yet, it’s all one filter.
So… we’re speculating here on where the bulk of the work is being done. The strength could be evenly distributed from beginning to end, but it could also be lumpy.
From the OP: “The real filter could be a combination of an early one and a late one, of course. But, unless the factors are exquisitely well balanced, it’s likely that there is one location in civilizational development where most of the filter lies (i.e. where the probability of getting to the next stage is the lowest).”
That’s very non-obvious to me; I can’t see why there couldn’t be (say) a step with probability around 1e-12, one with probability around 1e-7, one with probability around 1e-5, one with probability around 1e-2, and a dozen ones with joint probability around 1e-1, so that no one step comprises the majority (logarithmically) of the filter.
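To make the (logarithmic) arithmetic concrete, here is a minimal sketch using just the illustrative step probabilities above (the dozen roughly equal steps are folded into the single 1e-1 entry):

```python
import math

# Illustrative step probabilities from the comment above (assumptions, not data):
# one 1e-12 step, one 1e-7, one 1e-5, one 1e-2, plus a dozen steps whose joint
# probability is about 1e-1.
steps = [1e-12, 1e-7, 1e-5, 1e-2, 1e-1]

total = math.prod(steps)                      # joint probability of passing every step
filter_size = -math.log10(total)              # total filter, in orders of magnitude
hardest_share = -math.log10(min(steps)) / filter_size

print(f"total pass probability: {total:.0e}")                          # 1e-27
print(f"hardest step's share of the log-filter: {hardest_share:.0%}")  # ~44%, not a majority
```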
Good point, but note that in your example, the top filter does a lot more than the runner-up. There could be a lot of value in knowing what that top one is, even if the others combined are more important. (But that wouldn’t answer the question, how much danger do we still face? Which may be the biggest question. So again—good point.)
It is. But even there, I’m under the impression that many people are focussing on the answers “most of it” and “hardly any”, neglecting everything in between.
A few possible hypotheses (where by “supercomputers” I mean ‘computation capabilities comparable to those available to people on Earth in 2014’); a rough numerical sketch of what each one implies follows the list:

1. Nearly all the filter ahead. A sizeable fraction of all star systems have civilizations with supercomputers, and about one in 1e24 of them will take over their future light cone.
2. Most of the filter ahead. There are about a million civilizations with supercomputers per galaxy on average, and about one in 1e18 of them will take over their future light cone.
3. Halfway through the filter. There is about one civilization with supercomputers per galaxy on average, and about one in 1e12 of them will take over their future light cone.
4. Most of the filter behind. There is about one civilization with supercomputers per million galaxies on average, and about one in a million of them will take over their future light cone.
5. Nearly all of the filter behind. There have been few or no other civilizations with supercomputers than us in our past light cone, and we have a sizeable chance of taking over our future light cone.
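A minimal sketch of that reading, under assumptions the comment does not state (roughly 1e11 stars per galaxy, roughly 1e11 galaxies in the relevant volume, and placeholder values for “sizeable fraction”, “few or no”, and “sizeable chance”), just to show how each hypothesis splits the filter between behind us and ahead of us:

```python
import math

STARS_PER_GALAXY = 1e11   # assumed round number, not from the comment
GALAXIES = 1e11           # assumed round number for the relevant volume

# (hypothesis, civilizations with supercomputers per star system, takeover probability)
hypotheses = [
    ("1. nearly all ahead",  1e-1,                                1e-24),  # "sizeable fraction" ~ 0.1 (assumed)
    ("2. most ahead",        1e6 / STARS_PER_GALAXY,              1e-18),
    ("3. halfway",           1.0 / STARS_PER_GALAXY,              1e-12),
    ("4. most behind",       1.0 / (1e6 * STARS_PER_GALAXY),      1e-6),
    ("5. nearly all behind", 1.0 / (STARS_PER_GALAXY * GALAXIES), 1e-1),  # ~1 civ total, "sizeable chance" ~ 0.1 (assumed)
]

for name, civs_per_star, p_takeover in hypotheses:
    behind = -math.log10(civs_per_star)   # orders of magnitude of filter already passed
    ahead = -math.log10(p_takeover)       # orders of magnitude still to pass
    print(f"{name:22s} {behind / (behind + ahead):4.0%} of the filter behind us")
```

With these placeholder numbers the five hypotheses come out at roughly 4%, 22%, 48%, 74%, and 96% of the filter behind us, which is just the ordering the labels suggest.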
ISTM people push too much of their probability mass near the ends and leave too little in the middle for some reason. (In particular, I’m under the impression that certain singularitarians unduly privilege hypothesis 5 just because they can’t imagine a reason why we will fail to take over the future light cone—which is way too Inside View for me—and they think the only thing left to know is whether the takeover will be Good or Bad.)
I think there’s pretty little practical difference between 3 and 4 (unless you value us taking over the future light cone at somewhere between 1e6 and 1e12, on some appropriate scale), and not terribly much between 1 and 4 either, so maybe people are conflating anything below 5 together and that’s what they actually mean when they sound like they say 1?
(Of course there’s 6, the planetarium hypothesis: someone in our past light cone has already taken over their future light cone, but they don’t want us to know for some reason or another. But I don’t think that more than 99.99% of possible superintelligences would give a crap about us inferior beings, so this can explain at most a factor of about 1e4, which against a total filter of roughly 1e24 is about one sixth of it, logarithmically. Just add “visibly” before “take over” everywhere above.)
BTW, I think that 1 is all but ruled out (and 2 is slightly disfavoured) by the failure of SETI (at least if you interpret the present tense to mean ‘as of the time their worldline intersects the surface of our past light cone’), and 5 is unlikely because of Moloch (if he’s scary enough to stop cancer from killing whales, he’s totally likely to be scary enough to stop >99.9% of civilizations with supercomputers from taking over the light cone).
My own probability distribution has a broad peak somewhere around 4 with a long-ish tail to the left.
From the article: “The real filter could be a combination of an early one and a late one, of course. But, unless the factors are exquisitely well balanced, it’s likely that there is one location in civilizational development where most of the filter lies (i.e. where the probability of getting to the next stage is the lowest).”
That doesn’t sound like it admits the possibility of twelve, independent, roughly equally balanced filters.
You’re being uncharitable. “[It’s] likely [that X]” doesn’t exclude the possibility of non-X.
If you know nothing about a probability distribution, it is more likely that it has one absolute maximum than more than one.
Maybe I am being uncharitable, but when Sophronius asks “[c]an somebody explain to me why people generally assume that the great filter has a single cause?” and you reply “I don’t think anyone really assumes that”, I have to admit that I’ve always seen people think of the Great Filter in terms of one main cause (e.g., look at the poll in this thread where people choose one particular cause), and not in terms of multiple causes.
Though, you’re right that no one has said that multiple causes are outright impossible. And you may be right that one main cause makes a lot more sense. But I do think Sophronius raises a question worth considering, at least a bit.
At least in other Less Wrong posts and comments on the topic, the question is usually presented probabilistically, as in “Does the bulk of the great filter lie ahead of us or behind us?”
Though, it is usually not specified whether the writer is asking where the greatest number of candidates gets ruled out, or the greatest fraction. Maybe there are ten factors that each eliminate 90% of candidates prior to the technological civilization level (leaving maybe a hundred near-type-I civilizations per galaxy), but one factor (AI? self-destruction by war or ecological collapse? Failing to colonize other worlds before a stray cosmic whatever destroys your homeworld?) takes out 99% of what is left. In that case nearly all the great filter would be behind us, and our odds still wouldn’t be good.
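As a worked version of that example (the specific percentages are just the illustrative ones from the paragraph above):

```python
import math

# Illustrative numbers from the paragraph above: ten early factors that each let
# only 10% of candidates through, then one late factor that lets 1% through.
early_pass = [0.1] * 10
late_pass = 0.01

p_reach_tech = math.prod(early_pass)   # 1e-10 of candidates reach technological civilization
p_total = p_reach_tech * late_pass     # 1e-12 pass the whole filter

share_behind = math.log10(p_reach_tech) / math.log10(p_total)
print(f"share of the log-filter behind a technological civilization: {share_behind:.0%}")  # ~83%
print(f"chance such a civilization still fails the last step: {1 - late_pass:.0%}")        # 99%
```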
I don’t have an answer but here’s a guess: For any given pre-civilizational state, I imagine there are many filters. If we model these filters as having a kill rate, then my (unreliable stats) intuition tells me that a prior on the kill rate distribution should be log-normal. I think this suggests that most of the killing happens on the left-most outlier, but someone better at stats should check my assumptions.
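A minimal simulation of that intuition, under assumptions the comment leaves open (each step’s difficulty, in orders of magnitude of candidates removed, is drawn log-normally, and the steps are independent):

```python
import numpy as np

rng = np.random.default_rng(0)
N_SAMPLES, N_STEPS = 100_000, 12   # assumed: a dozen independent filter steps per sample

# Assumed model: each step's "difficulty" (orders of magnitude it removes) is
# log-normally distributed across steps.
difficulty = rng.lognormal(mean=0.0, sigma=1.5, size=(N_SAMPLES, N_STEPS))

# How often does the single hardest step account for most of the total filter?
share_of_hardest = difficulty.max(axis=1) / difficulty.sum(axis=1)
print(f"median share of the hardest step: {np.median(share_of_hardest):.0%}")
print(f"fraction of samples where one step is a (log) majority: {(share_of_hardest > 0.5).mean():.0%}")
```

Whether the hardest step usually dominates depends heavily on the assumed sigma, which is exactly the kind of thing “someone better at stats” would want to check.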
I’d liken it to a chemical reaction. Many of them are multistep, and as a general statement chemical processes take place over an extremely wide range of rates, spanning many orders of magnitude (from less than a billionth of a second to years). So, in an overall reaction, there are usually several steps, and the slowest one is usually orders of magnitude slower than any of the others; that one’s called the rate-determining step, for obvious reasons: it’s so much slower than the others that speeding up or slowing down the others, even by a couple of orders of magnitude, has a negligible effect on the overall rate of reaction. It’s pretty rare for two of them to happen to run at nearly the same rate, since the range of orders of magnitude is so large.
I think that the evolution of intelligence is a stochastic process that’s pretty similar to molecular kinetics in a lot of ways, and in particular that all of the above applies to it as well; thus, it’s more likely that there’s one rate-determining step, one Great Filter, for the same reasons.
However (and I made another post about this here too), I do think that the filters are interdependent (there are multiple pathways, and it’s not a linear process but progress along a multidimensional surface). That’s not really all that different from molecular kinetics either, though.
Interesting. However, I still don’t see why the filter would work similarly to a chemical reaction. Unless it’s a general law of statistics that any event is always far more likely to have a single primary cause, it seems like a strange assumption since they are such dissimilar things.
Sorry for the delayed response; I don’t come on here particularly often.
The assumptions I’m making are that evolution is a stochastic process in which elements are in fluxional states and there is some measure of ‘difficulty’ in transitioning from one state to another, an energetic or entropic barrier of sorts; that to go from A to B (for example, from an organism with asexual reproduction to an organism with sexual reproduction) some confluence of factors must occur; and that that occurrence has a certain likelihood that depends on the conditions of the whole system (ecosystem). I think that this, combined with the large number of physical elements interacting (organisms), is enough to say that evolution is governed by something pretty similar to statistical thermodynamics.
So, from the Arrhenius equation,

k = A e^{-E_a / (RT)}

where k is the rate of reaction, A is the pre-exponential factor (roughly, how often the components that must come together actually do so), E_a is the activation energy, or energy barrier, and RT is the gas constant multiplied by the temperature.
The equation is mostly applied to chemistry, but it has also found uses elsewhere, like predicting the geographic progression of the blooming of sakura trees (http://en.wikipedia.org/wiki/Cherry_blossom_front). It really applies to any system that has certain kinetic properties.
So, ignoring all the chemistry-specific factors (like temperature), the relation in its most general form becomes

k = A e^{-BE}
This says essentially that the rate is proportional to a negative exponential of the barrier to the transformation, and small changes in the value of the barrier correspond to large changes in the value of the rate. Thus, it’s unlikely that two rates are similar. I don’t see why two unrelated things would be likely to have a similar barrier, and given this, they’re even less likely to have a similar rate.
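A minimal sketch of that sensitivity argument, with made-up barrier values purely for illustration (and B set to 1 for simplicity):

```python
import math

B = 1.0  # lump the chemistry-specific factors into one constant, as above

# Made-up barriers for five steps; on an absolute scale they differ only modestly.
barriers = [8.0, 10.0, 12.0, 15.0, 20.0]
rates = [math.exp(-B * E) for E in barriers]   # k = A e^{-BE}, with A = 1

slowest, runner_up = sorted(rates)[:2]
print(f"runner-up rate / slowest rate: {runner_up / slowest:.0f}")  # ~148
# A modest gap in barriers (20 vs. 15) already makes the slowest step ~150x
# slower than the next one, so it effectively sets the overall rate.
```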
Pareto principle.
There’s also the Fermi formula.
I think you need to explain your meaning more.
The Pareto principle is the 80-20 rule. Meaning, if there are four great filters with effectiveness 0.9, 0.9, 0.99, and 0.9, it will be the 0.99 filter that requires ten times the attention/luck and will get the name ‘Great’, even if there are many other hurdles adding to the end result.
This is a troubleshooting problem where we are still on the first stage of the diagnosis: identifying the largest contributor to the problem, identifying the low-hanging fruit. The whole filter that prevents life on every planet is going to be a combination of many stacked smaller filters, but we want to find the most relevant filter, either the 0.999 filter or the locally relevant filter. This discussion is all part of that diagnostic process to find the most relevant single cause.
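A minimal sketch of the arithmetic behind that 0.9 / 0.99 example (the effectiveness numbers are just the ones from the comment above):

```python
import math

# "Effectiveness" here means the fraction of candidates a filter stops.
effectiveness = [0.9, 0.9, 0.99, 0.9]
pass_probs = [1 - e for e in effectiveness]   # 0.1, 0.1, 0.01, 0.1

total_pass = math.prod(pass_probs)            # 1e-5 of candidates survive all four
shares = [math.log10(p) / math.log10(total_pass) for p in pass_probs]

print(f"overall pass probability: {total_pass:.0e}")
for e, p, s in zip(effectiveness, pass_probs, shares):
    print(f"filter {e}: pass probability {p:.2f}, {s:.0%} of the log-filter")
# The 0.99 filter is ten times harder to pass per attempt (0.01 vs. 0.1) and
# accounts for 40% of the total log-filter, twice the share of any other step.
```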