Ouch, nasty, damn.
Well, good for you for, well, choosing to live!
I think I’ll also second whpearson’s suggestion to get a bit of practice with the tools you may need later on early, so that when you really need them, it won’t be as big an issue.
Actually, for what it’s worth… you know what? I think I ought to finally do a bit more than just offer good wishes. I think I’ll treat this as the last bit of “excuse” I need, recalibrate my sense of the scale of the problems (or at least remind myself of it) by treating this as a base to multiply by, and, well… I’m going to deliberately take Eliezer’s “evil joke” from yesterday regarding the bystander effect (pointing and saying “you. save the world”) seriously and personally, and I respond with “I accept.”
Where was this joke? I don’t find it in Eliezer’s user page or a Google search.
It was at the Summit. He was using the example of… I forget the percentages, but if five people witness someone having an epileptic seizure, that person is less likely to get help than if only one person witnesses it.
So he pointed to a random person in the audience and directed them to save the world, instead of directing that mandate at the audience as a whole.
I hope to see an email in my inbox with details shortly.
And sent.
Soon as I figure details out. (Which, actually, partly involves me waiting on a response from someone else.)
Actually, no, before that.
Having made my decision doesn’t automatically cause a ray of light to shine down, revealing the optimal (or even reasonably good suboptimal) way to act on it. :) (The universe just isn’t that convenient. Actually, a universe that convenient and nice would probably be the kind of place that didn’t have as much existential risk for us in the first place.)
There we go, just found your email address. You’ll indeed see a message from me shortly.
Awesome.
Agreed, but on the other hand he shouldn’t say he’s doing so, because now I don’t have to.
Not really: the way the psychology works is, each person is inhibited from taking action if nobody else is. Once one person starts doing something, it often becomes easier for others to follow.
(My answer? “Yes, I’m trying.”)
Augh! No, don’t do that. (Er… don’t not do that… you know what I mean!) Right now I’m in a state of “what the heck did I just commit myself to during a terrible moment of sanity? Given my current skills, etc., how am I going to do this?” while trying to avoid, well, retreating from “I will” to “I’ll try” or any other form of running away from the problem.
Now I see this and my reaction is not entirely unlike “EEEEEEEEK! I haven’t even begun and I’ve already made it WORSE!”
I’ve thought about this situation, and I would like to know the context in which Eliezer said this. Was he talking about creating Friendly AI? That was my first thought, but he’s also talked about the dangers of running headlong into that problem.
In the typical bystander problem, you don’t have to be a doctor or a medic; sometimes you just have to dial 911. You probably don’t need any specific skills to save the world. You just have to make sure you don’t delude yourself into thinking your actions are sufficient.
And don’t worry, I promise not to not save the world.
Save the world? From what, exactly? Black holes? Gamma ray bursts? The eventual expansion of the Sun? UnFriendly Seed AI? Asteroid impacts? Nuclear weapons? Runaway climate change? Peak Oil? The Year 2032 problem? Annoying Christmas songs?
“Make the world a much better place” seems a somewhat less ill-defined command.
I’m positive that most actions that make the world a much better place would be insufficient to deal with whatever threats Eliezer was talking about.
Doing that is just finding a good-sounding reason not to save the world.