Much to my surprise, Richard Dawkins and Jon Stewart had a fairly reasonable conversation about existential risk on the Sept. 24, 2013 edition of The Daily Show. Here’s how it went down:
STEWART: Here’s my proposal… for the discussion tonight. Do you believe that the end of our civilization will be through religious strife or scientific advancement? What do you think in the long run will be more damaging to our prospects as a human race?
In reply, Dawkins says Martin Rees (of CSER) thinks humanity has a 50% chance of surviving the 21st century, and one cause for such worry is that powerful technologies could get into the hands of religious fanatics. Stewart replies:
STEWART: …[But] isn’t there a strong probability that we are not necessarily in control of the unintended consequences of our scientific advancement?… Don’t you think it’s even more likely that we will create something [for which] the unintended consequence… is worldwide catastrophe?
DAWKINS: That is possible. It’s something we have to worry about… Science is the most powerful way to do whatever you want to do. If you want to do good, it’s the most powerful way to do good. If you want to do evil, it’s the most powerful way to do evil.
STEWART: …You have nuclear energy and you go this way and you can light the world, but you go this [other] way, and you can blow up the world. It seems like we always try [the blow up the world path] first.
DAWKINS: There is a suggestion that one of the reasons that we don’t detect extraterrestrial civilizations is that when a civilization reaches the point where it could broadcast radio waves that we could pick up, there’s only a brief window before it blows itself up… It takes many billions of years for evolution to reach the point where technology takes off, but once technology takes off, it’s then an eye-blink — by the standards of geological time — before...
STEWART: …It’s very easy to look at the dark side of fundamentalism… [but] sometimes I think we have to look at the dark side of achievement… because I believe the final words that man utters on this Earth will be: “It worked!” It’ll be an experiment that isn’t misused, but will be a rolling catastrophe.
DAWKINS: It’s a possibility, and I can’t deny it. I’m more optimistic than that.
STEWART: … [I think] curiosity killed the cat, and the cat never saw it coming… So how do we put the brakes on our ability to achieve, or our curiosity?
DAWKINS: I don’t think you can ever really stop the march of science in the sense of saying “You’re forbidden to exercise your natural curiosity in science.” You can certainly put the brakes on certain applications. You could stop manufacturing certain weapons. You could have… international agreements not to manufacture certain types of weapons...
And then the conversation shifted back to religion. I wish Dawkins had mentioned CSER’s existence.
And then later in the (extended, online-only) interview, Stewart seemed unsure as to whether consciousness persisted after one’s brain rotted, and also unaware that 10^22 is a lot bigger than a billion. :(
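(For scale: 10^22 / 10^9 = 10^13, so 10^22 isn’t just “a lot bigger” than a billion, it’s ten trillion times bigger.)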
Jon’s what I call normal-smart. He spends most of his time watching TV, mainly US news programs, which are quite destructive to rational thinking even when the purpose is comedic fodder and exposing hypocrisy. He’s very tech-averse, relying on the guests he brings on the show to supply information he might use, and trusting his (quite good) intuition to fit it into reality. As such, I like to use him as an example of how more normal people feel about tech / geek issues.
Every time he has one of these debates, I really want to sit in as moderator so I can translate each side, since they often talk past each other. Alas, it’s a very time-restricted format, and I’ve only seen him fact-check on the fly once (Google, Wikipedia).
The number thing was at least partly a joke, along the lines of “anything bigger than 10 doesn’t make much sense to me”: scope-insensitivity humor. I’ve done similar things before.
I’m beginning to think we shouldn’t be surprised when reasonably intelligent atheists have reasonable thoughts about x-risk. Both of the reasonably intelligent non-LWer atheists I’ve talked to about LW issues in the past few weeks agreed with everything I said and found it all sensible and unsurprising. Most LW users started out as reasonably intelligent atheists. So where did the “zomg everyone is so dumb and only LW can think” meme originate, exactly? Is there any hard data on this topic?