The efficient market hypothesis? Are you serious?
So… you’re saying the world is going to end and nobody is doing anything to avoid it, but I can’t say that a stock is going to appreciate and nobody is buying it.
Yeah, pretty much. Welcome to Earth.
You’ve deleted the first part of your comment because you probably realized it didn’t make much sense, but I’m going to answer it anyway. You made a comparison between solving the alignment problem and predicting the price of a stock, and that’s just not right. Google execs don’t have to solve the alignment problem themselves; they just have to recognize its existence and its magnitude. In the same way, retail investors don’t have to build the AGI themselves; they just have to notice that it’s going to happen soon.
You’ve deleted the first part of your comment because you probably realized it didn’t make much sense
I deleted it because my comment sounds cooler in my head if I leave the explanation out, and also because I was tired of arguing.
You made a comparison between solving the alignment problem and predicting the price of a stock, and that’s just not right. Google execs don’t have to solve the alignment problem themselves; they just have to recognize its existence and its magnitude. In the same way, retail investors don’t have to build the AGI themselves; they just have to notice that it’s going to happen soon.
The point I was (maybe poorly) trying to make was that Google execs are not individually incentivized to lobby their company to prevent an AGI-driven collapse in the same way that hedge fund managers are incentivized to predict Google’s stock price. Those executives are not getting paid to delay AGI timelines, and many are getting paid not to delay AGI timelines.
AGI prevention is a coordination problem. Securities pricing is a technical problem. In the same way that society is really bad at tax law or at preventing global warming, and really good at video game development, it is really bad at AGI alignment and really good at pricing securities. And thus oil barons continue producing oil, and Google continues producing AGI research.
Yeah, but let’s be honest: oil barons don’t think climate change is going to kill them. Capitalism may produce all sorts of coordination problems, but personal survival is still the strongest incentive. I think Google execs wouldn’t hesitate to stop the research if they were expecting a paperclip maximizer.
I think you’re being naive. But it doesn’t really matter. Oil barons, in practice, also tend to convince themselves that climate change is a hoax, or rationalize their participation away with “if we don’t do it, somebody else will”. That’s what the vast majority of Google executives would do if it got to the point where they started worrying a bit, and unfortunately the social pressure isn’t even sufficient to drive them there yet.