Have you spent much time working in labs? It's been my experience that most of the work is data collection, where the process you are collecting data on is the limiting factor. Honestly, I can't think of any lab I've been a part of where data collection was not the rate-limiting step.
Here are the first examples that popped into my head:
Consider Lenski's work on E. coli. It took from 1988 to 2010 to reach 50,000 generations (and the experiment is still going). The experimental design and data analysis phases here are minimal in length compared to the time it takes E. coli to grow and divide.
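To put a rough number on that (a back-of-the-envelope sketch; the daily 1:100 dilution giving ~6.6 doublings per day is the standard figure for the long-term evolution experiment):

```python
import math

# Lenski's LTEE: each day the culture is diluted 1:100 and regrows,
# which corresponds to log2(100) ~ 6.64 doublings (generations) per day.
generations_per_day = math.log2(100)

days_to_50k = 50_000 / generations_per_day
print(f"{generations_per_day:.2f} generations/day")           # ~6.64
print(f"{days_to_50k / 365.25:.1f} years to 50k generations")  # ~20.6
```

The biology sets the clock: ~20 years is simply what 50,000 E. coli generations costs at one transfer per day, no matter who designs the experiment.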
It took 3 years to go from the first potential top quark events on record (1992) to the actual discovery (1995). That time was just waiting for enough events to accumulate. (I'm ignoring the ~20 years between prediction and first events, since maybe a superintelligence could have somehow narrowed down the mass range to explore; I'm also ignoring the time required to actually build an accelerator. That's 3 years of just letting the machine run.)
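As a toy sketch of why that wait is hard to compress (all rates below are made-up illustrative numbers, not the actual CDF/D0 figures): in a counting experiment, signal and background both accumulate linearly with running time, so the discovery significance grows only as the square root of running time.

```python
import math

# Toy counting experiment with hypothetical rates: significance
# scales like S / sqrt(B), where both grow linearly with run time.
signal_rate = 30.0       # hypothetical signal events per year
background_rate = 100.0  # hypothetical background events per year

def significance(years):
    s = signal_rate * years
    b = background_rate * years
    return s / math.sqrt(b)  # grows only like sqrt(years)

for years in (1, 2, 3):
    print(f"year {years}: {significance(years):.1f} sigma")
# year 1: 3.0 sigma, year 2: 4.2 sigma, year 3: 5.2 sigma
```

Doubling the significance always costs four times the running time, regardless of how clever the experimenters are.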
Depending on what you are looking for, timescales for NMR data collection run from weeks to months, and if your signal is small, you might need dozens of these runs.
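That "dozens of runs" figure falls out of signal averaging: SNR improves only as the square root of the number of scans, so (in this hypothetical sketch, with made-up SNR values) pulling a weak signal out of the noise gets quadratically expensive.

```python
import math

# Signal averaging in NMR: the coherent signal adds linearly with the
# number of scans while the noise adds as sqrt(scans), so SNR ~ sqrt(n).
snr_single_scan = 0.5  # hypothetical: signal buried below the noise
target_snr = 10.0      # hypothetical detection threshold

n_scans = math.ceil((target_snr / snr_single_scan) ** 2)
print(f"{n_scans} scans needed")  # 400 scans for a 20x SNR gain
```

And each scan has to wait out the nuclei's relaxation time before the next pulse, which is set by the physics of the sample, not by the experimenter's cleverness.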
Also, anyone who has ever worked with a low-temperature system can tell you that keeping the damn thing working is a huge time sink, so you could add 'necessary machine maintenance' to these sorts of tasks. It's not obvious to me that leak-checking your cryogenic setup to troubleshoot it can be sped up much by higher IQ.
No, I did not, and it shows :-)
Thank you for the examples; I see your point. I can imagine plausible-sounding ways that 300-IQ AIs would accelerate some of these, but since I don't have direct experience, that might not mean much.
That said, I notice that bluej's post mentioned the AI dominating scientific output, not necessarily increasing its rate by much. Of course, a single AI instance would not dominate science (as evidenced by the fact that the few ~200-IQ humans who have existed didn't claim a big share of it), but an AI architecture that can be easily replicated might. After all, at least as far as IQ is concerned, anyone who hires an IQ-140–160 scientist now would just use an IQ-300 AI instead.
Of course, science is not just IQ, and even if IBM's Watson had IQ 300 right now, I doubt enough instances of it would be built in five years to replace all scientists, simply due to hardware costs (not to mention licensing and patent wars). Then again, I don't have a very good feel for the relative cost of humans and hardware for operations the size of Google, so I don't have very high confidence either way. But 20 to 30 years would certainly change the landscape hugely.