there’s a big difference between saying it saves a few years vs. causes us to have a chance at all when we otherwise wouldn’t. [...] it seems like most of the relevant ideas were already in the memespace
I was struck by the 4th edition of AI: A Modern Approach quoting Norbert Wiener writing in 1960 (!), “If we use, to achieve our purposes, a mechanical agency with whose operation we cannot interfere effectively … we had better be quite sure that the purpose put into the machine is the purpose which we really desire.”
It must not have seemed like a pressing issue in 1960, but Wiener noticed the problem! (And Yudkowsky didn’t notice, at first.) How much better off are our analogues in the worlds where someone like Wiener (or, more ambitiously, Charles Babbage) did treat it as a pressing issue? How much measure do they have?
Yeah, it’s quite plausible it would have taken another decade (then again, I don’t know if Bostrom thought superintelligence was possible before encountering Eliezer)