Transcription errors and gaps (I definitely haven’t got them all, and in some cases I haven’t got them despite trying because there are skips in the audio and I couldn’t confidently reconstruct what had been skipped):
going through my own mini existential crisis right now [p6, 0xLucas]
limited edition collectible (singular) [p9, 0xLucas]
at collectibles.bankless.com [p9, 0xLucas]
live Twitter spaces (plural) [p9, 0xLucas]
(dunno what the ?? there is, though)
one-of-one edition [p16, 0xLucas]
randomly assigned [p16, 0xLucas]
not quite desktop [p27, Eliezer]
to this podcast probably has an above average IQ [p29, Eliezer]
smarter than chimpanzees _in the past_ ? [p29, Eliezer]
over _the_ last billion years [p33, Eliezer]
there’s evolution before that but it’s, look, like pretty slow, just like single cell stuff. [p33, Eliezer]
(yes, “Bing” is correct) [p35, Eliezer]
has sufficient leadership focus [p35, Eliezer]
oh, they they could spend the resources [p37, Eliezer]
_directing_ more resources into the field [p43, Eliezer]
to head up Google’s AI projects [p43, Eliezer]
than the question of whether to close[d]-source the stuff [p48, Eliezer]
(I think “pure escape” might be “purest cap...”)
And you can’t calculate _exactly_ how large is too large. [p53, Eliezer]
it’s not just like a simple empirical chunk [p55, Eliezer]
But these black swan events … [p56, David]
this present feels full of human ingenuity [p57, Eliezer]
the human ingenuity that’s being directed at that is much larger but also … [p57, Eliezer]
like mass or whatever [p59, Eliezer]
could possibly try to align [p59, Eliezer]
trying to build an aligned AI [p60, Eliezer]
so that the earth is still a park after [p64, Eliezer]
It’s pretty accurate, I think. Thanks for going through it!