Not sure what reasons the Wall Street Journal gave (haven’t seen the article), but they’re probably right. Faster spread doesn’t change the total number of cases all that much vs. slow spread, but it does reduce the average number of sequential infections each lineage experiences before the disease burns out, and thus the evolutionary distance the disease can achieve.
Exactly my logic. The more generations you have to work with (and apply selective pressure to), the easier it is to produce a highly-optimized virus (the thing we don’t want).
Let’s suppose that each person-to-person transmission adds mutations that affect the “badness” of the virus (say, immune escape); I would assume the change in badness per transmission is normally distributed. Now, suppose we want to make a virus that’s 4 standard deviations above average.
If we have to do this in one generation… Wikipedia tells us that the chance of a sample landing more than 4 standard deviations from the mean is roughly 1 in 16k; since that counts both tails, we would need about 32k infections on average to produce the +4sd virus.
If we have two generations, then we can produce a +2sd virus in the first generation, select it (assume we can do this perfectly), and use it to produce candidates for the +4sd virus. A +2sd result is roughly 1 in 44, so on average we need 44 infections in the first generation to get the +2sd virus, then 44 more from it, for a total of 88 infections. With 4 generations, we need about 6 infections per generation to gain +1sd each time, totaling 24 infections.
If we wanted to make a +6sd virus: 1 generation requires about 1 billion infections, 2 generations require 740 x 2 = 1480 infections, 3 generations take 44 x 3 = 132 infections, and 6 generations take 6 x 6 = 36 infections. If we wanted to make Delta… I have no idea how many standard deviations it is from the original SARS-CoV-2, but I’m confident we wouldn’t get anywhere near it even if COVID Patient Zero had infected the entire planet by direct transmission.
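These figures are easy to sanity-check. Here’s a minimal sketch, assuming the idealized model above (each infection is an independent normal draw, the target is split evenly across generations, and selection between generations is perfect); `infections_needed` is just a name I made up for illustration:

```python
# Rough check of the figures above under the thread's idealized model:
# each infection draws a normally distributed "badness" change, the target is
# split evenly across generations, and selection between generations is perfect.
from scipy.stats import norm

def infections_needed(target_sd, generations):
    """Expected number of infections to reach +target_sd in `generations` steps."""
    gain_per_generation = target_sd / generations
    odds = 1.0 / norm.sf(gain_per_generation)  # 1-in-`odds` infections makes the jump
    return generations * odds

for target, generation_counts in ((4, (1, 2, 4)), (6, (1, 2, 3, 6))):
    for gens in generation_counts:
        print(f"+{target}sd in {gens} generation(s): "
              f"~{infections_needed(target, gens):,.0f} infections")
```

For +4sd this prints roughly 32k, 88, and 25 infections for 1, 2, and 4 generations, and for +6sd roughly 1 billion, 1,480, 132, and 38 for 1, 2, 3, and 6 generations, close to the back-of-the-envelope numbers above.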
For another angle: Suppose you have a budget of 1000 infections. If you have 1 generation, then you can gain about +3sd in that generation. 6 generations, you can get +2.5 per generation, totaling +15sd. 20 generations, you get about +2sd per generation, totaling +40sd. 70 generations gives you about +1.5sd per generation, yielding +105sd.
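The same budget arithmetic as a sketch, assuming the best of N draws is roughly the value a single draw exceeds with probability 1/N, i.e. the normal inverse survival function (exact constants will differ a bit):

```python
# Fixed budget of infections split across different numbers of generations.
from scipy.stats import norm

budget = 1000  # total infections available

for generations in (1, 6, 20, 70):
    per_generation = budget // generations
    # Best of `per_generation` draws ~= value exceeded with probability 1/per_generation.
    gain = norm.isf(1.0 / per_generation)
    print(f"{generations:>2} generations of {per_generation:>4}: "
          f"~+{gain:.1f}sd per generation, ~+{generations * gain:.0f}sd total")
```

This lands on roughly +3sd, +15sd, +41sd, and +103sd for 1, 6, 20, and 70 generations, the same picture as above.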
Changing the size of each generation has only a marginal effect on how many standard deviations you can get out of it: the probability of landing x standard deviations out falls off superexponentially (roughly like e^(-x²/2)) with distance from the mean. Much more important is the number of generations you have.
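To put a number on “marginal”, here is the same inverse-survival-function sketch with a single generation’s size doubled and tripled:

```python
# Effect of doubling/tripling a single generation's size (same assumptions as above).
from scipy.stats import norm

for size in (1000, 2000, 3000):
    print(f"best of {size:>4} infections: ~+{norm.isf(1.0 / size):.2f}sd")
# Prints roughly +3.09, +3.29, +3.40: the attainable extreme creeps up by only a
# tenth or two of an sd each time, because the tail thins out like exp(-x^2 / 2).
```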
Now, this is with utterly perfect selection in every generation, where we take the best (worst) virus from each generation and delete all the rest. Luckily, real selection pressures are a lot weaker than that. But I think a model would essentially say “if you’re infecting N people in each generation, then the next generation’s virus will be, on average, +x standard deviations more infective/immune-escaping/etc. than the previous one”, and I’m pretty sure the fact will remain that doubling or even tripling N has only a small effect on x. An intuition pump: shrinking the generation size means taking people who would have been infected by virus version n (plus mutations) and instead infecting them with version n+1 (plus mutations); clearly this accelerates development.
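A tiny Monte Carlo of that perfect-selection model, just to make its shape concrete (the function name, seed, and parameters here are mine, purely for illustration):

```python
# Monte Carlo sketch of the "perfect selection each generation" model: every
# infection adds a normal(0, 1) mutation to the current best virus, and only
# the worst (most evolved) virus of each generation seeds the next one.
import numpy as np

rng = np.random.default_rng(0)

def evolve(generation_size, generations, trials=200):
    """Average total sd gained after `generations` rounds of perfect selection."""
    totals = np.zeros(trials)
    for _ in range(generations):
        # Each infection mutates the current best virus; keep only the largest jump.
        totals += rng.standard_normal((trials, generation_size)).max(axis=1)
    return totals.mean()

for n in (50, 100, 150):
    print(f"N={n:>3} per generation, 20 generations: ~+{evolve(n, 20):.0f}sd on average")
# Doubling or tripling N nudges the per-generation gain up only slightly;
# adding generations is what really moves the total.
```

With N = 50 this lands around +45sd over 20 generations, while tripling N to 150 only pushes it to roughly +53sd, consistent with generation count mattering far more than generation size.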
Therefore, if, say, 4 billion people will get infected, and if we want the virus to be as little evolved as possible by the end, then we should make this happen in as few generations as possible. (This is to be balanced against the risk of having so many people sick at once that (a) health care systems are overwhelmed or (b) other services have too many workers out sick. Also, there is something to be said for waiting for treatments like Paxlovid, although my impression is that’s not a practical goal for most people.)
Delta and Omicron are not in the same lineage; a separation into distinct serotypes is happening. All the pain and deaths may not be worth it.
I’m not sure what your point is here? Omicron is, to my understanding, replacing Delta, which should be beneficial.
Omicron is Delta’s “nephew”
The point is, it might not matter what we do with Omicron; the next VOC can still come out of some animal viral pool, from a virus variant we know nothing about.
“it does reduce the average number of sequential infections each lineage experiences before the disease burns out”
Does it, though? As a thought experiment, say we just let A.1 rip. No lockdowns, no nothing. It burns through the world in some number of months, say 3-4.
Would we not see lineages evolve in that time? There would likely be more replications in that scenario than under the approach we’ve taken now. So maybe I can imagine there’s reason to think we’d see more lineages in that scenario than we do now.