If you’re expecting the singularity within a century, does it make sense to put any thought into eugenics except for efforts to make it easy to avoid the worst genetic disorders?
This could be generalised to putting thought into anything. Will the singularity be achieved within one childhood? If not, more smart people may be useful to apply to the problem. If you’re smart, make more smart people.
That seems to depend on a number of assumptions—your timeline, whether you expect a soft or a hard takeoff, the centrality of raw intelligence vs. cultural effects to research quality, possible nonlinearity of network effects on intellectual output. But I’d bet that the big one is time: if you think (unrealistically, but run with it) that you can improve a test population’s intelligence by 50%, that could be very significant if you’re expecting a 2100 singularity but likely won’t be if you’re expecting one before they graduate from college.
Depends on the confidence with which you expect it. If you’re 95+% confident, probably not. Lower? Probably yes. Even an intervention with only a 10% chance of ever mattering may be worth doing if its value, should it succeed, is at least 10x its cost plus opportunity cost.
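To spell out the arithmetic behind that threshold (a quick sketch; p, V, and C are labels I'm introducing for the probability the intervention ever matters, its value if it does, and its total cost including opportunity cost):

\[
\mathbb{E}[\text{net value}] = pV - C > 0 \iff V > \frac{C}{p}
\]

With p = 0.1, the intervention is worth doing whenever V > 10C, which is the "at least 10x its cost" condition above.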
Good point. The cutoff is not necessarily the singularity, either—once we have sufficiently awesome genetic engineering, there’s no point to eugenics.