I am listening to your talk with Lex Fridman, and it occurs to me that I agree: our ability to act is outpacing our ability to understand the consequences of our actions. Just look at recent technology, say smartphones. For all the good they do, they also do untold harm — far less than AGI could, but humans have a bad habit of introducing technology to the general population for the sake of money without considering the consequences. How do we change that basic human drive? I can imagine a lot of good AGI could do, and a lot of harm, and I fear the profit motive will, as always, push AGI out the door, consequences be damned. Perhaps it is humans' short life span that allows smart people to deny the future. Interested in your thoughts. Thanks.