Eliezer is certainly a smart guy, but I haven’t personally seen anything indicating that his coding skills are especially high-level.
Perhaps a better comparison would be how much Eliezer would be making as a programmer had he worked as one for the duration of SIAI's existence. That is something of a middle ground between SIAI valuing Eliezer less because he has been working at SIAI, and valuing him ridiculously highly because he is the only person with experience doing his job at SIAI (the other geniuses having been working in industry).
Maybe, but if SIAI’s goal is just to employ Eliezer for as little money as possible then that’s not an important consideration.
The real reason SIAI wants to pay Eliezer money beyond what he needs to subsist on is so he can buy luxuries for himself, have financial security, have whatever part of his brain that associates high income with high status be satisfied, and feel good about his employment at SIAI. These are good reasons to pay Eliezer more than a living wage. If Eliezer didn't get any utility from money beyond the first $50K, I don't think it would be sensible to pay him more than that. I don't see how hypothetical programming careers come into any of this.
ETA: I guess maybe the hypotheticals could be important if we’re trying to encourage young geniuses to follow Eliezer’s path instead of getting careers in industry?
It actually is—because both of the bounds I mentioned would come into the bargaining (if that was how the pay was to be determined). What Eliezer could get elsewhere is one consideration; what SIAI could get as an alternative to Eliezer is the other. It would only be if Eliezer were an utterly incompetent negotiator that SIAI could force him to accept the minimum of what he would get elsewhere. And we know he is a lot better at game theory than that!
Merely as a less-bad reference point than a programming career based on current programming skills. It was your point (as adapted from the OP), not mine; I'm just pointing out why the comparison is unreasonable as stated.
Sure, Eliezer’s value to SIAI is also an important component to negotiations.
But the reference point doesn’t actually matter for anything, even if quoting reference points is a common dark arts negotiation tactic (tantamount to saying “it’s standard” in order to get people to accept something). I’d guess that most of negotiation is dark arts.
It must be weird to be Eliezer Yudkowsky reading this.
Yep. The way it actually works is that I’m on the critical path for our organizational mission, and paying me less would require me to do things that take up time and energy in order to get by with a smaller income. Then, assuming all goes well, future intergalactic civilizations would look back and think this was incredibly stupid; in much the same way that letting billions of person-containing brains rot in graves, and humanity allocating less than a million dollars per year to the Singularity Institute, would predictably look pretty stupid in retrospect. At Singularity Institute board meetings we at least try not to do things which will predictably make future intergalactic civilizations think we were being willfully stupid. That’s all there is to it, and no more.
I have an image of Eliezer queued up in a coffee shop, guiltily eyeing up the assortment of immodestly priced sugary treats. The reptilian parts of his brain have commandeered the more recently evolved parts into fervently computing the hedonic calculus of an action that other, more foolish types might misclassify as a sordid instance of discretionary spending. Caught staring into the glaze of a particularly sinful muffin, he now faces a crucial choice. A cognitive bias, thought to have been eradicated from his brain before SIAI was founded, seizes its moment. "I'll take the triple chocolate muffin, thank you," Eliezer blurts out. "Are you sure?" the barista asks. "Well, I can't be 100% sure. But the future of intergalactic civilizations may very well depend on it!"
In accordance with the general fact that “calories in—calories out” is complete bullshit, I’ve had to learn that sweet things are not their caloric content, they are pharmaceutical weight-gain pills with effects far in excess of their stated caloric content. So no, I wouldn’t be able to eat a triple chocolate muffin, or chocolate cake, or a donut, etcetera. But yes, when I still believed the bullshit and thought the cost was just the stated caloric content, I sometimes didn’t resist.
Luckily a juicy porterhouse steak is a nice stand-in for a triple chocolate muffin. Unfortunately they don’t tend to sell them at coffee shops.
Perhaps I’ll end my career as a mathematician to start a paleo coffee shop.
I fully expect that less than 0.1% of mathematicians are working on math anywhere near as important as starting a chain of paleo coffee shops. What are you working on?
Fluid dynamics. Considering jumping over to computational neuroscience.
I’ve put some serious thought into a paleo coffee shop. It’s definitely on my list of potential extra-academic endeavors if I end up leaving my ivory tower.
A fast search suggests that there aren’t any paleo restaurants, and possibly not even paleo sections on menus, so there might just be a business opportunity.
Is coffee in the paleo diet?
There isn’t really a rigorous definition of the diet. One guideline some people use is that you shouldn’t eat anything you wouldn’t eat raw, which excludes beans. Coffee beans aren’t actually beans though. I wouldn’t be surprised if some people consider coffee not paleo, but there are big names in the paleo scene that drink coffee (Kurt Harris, Art de Vany).
Really, I would say paleo is more a philosophy for how to go about homing in on a diet, rather than a particular diet in and of itself. There are hard lines, like chocolate muffins. I don't think coffee is close to that line though.
That surprises me. The paleo diet I know includes meat, which you should cook in order to kill parasites.
You’re right, the guideline is not too well worded. You should probably replace “what you wouldn’t eat raw” with “what would be toxic to eat raw”.
Meat is edible raw. There’s nothing inherently toxic about uncooked meat. Many other foods require cooking to diminish their toxicity (potatoes, grains, legumes). There’s definitely concern about parasites in raw meat, but parasites are not an inherent quality of the meat itself.
There’s actually a whole raw paleo sub-subculture. I wouldn’t recommend it personally, and I’m not keen to try it myself, but it’s there.
There’s also a theory that the development of cooking was responsible for the evolutionary Great Leap Forward.
I think it's likely humans evolved to eat cooked food. The guideline "don't eat anything you wouldn't eat raw" isn't intended to dissuade people from eating cooked food, but rather to serve as a heuristic for identifying foods that were probably less commonly eaten by our ancestors. It's unclear to me how accurate the heuristic is. A big counterexample is tubers: they are widely eaten by modern hunter-gatherers and are toxic when uncooked.
Tea might be paleo even if coffee isn't.
Is this based on Taubes, or is there more to support it? I found his demolition of "calories in, calories out" fairly convincing, but wasn't wholly convinced by his carb-demonizing.
In what way do you consider “calories in—calories out” complete bullshit? (My guess as to your answer: knock-on effects w.r.t. a Seth-Roberts-style set point of some kind.)
Probably in the same sense that people mean, under generous interpretations, when they say “The Laffer Curve is bullshit”—which is to say, it’s technically true, but not a relevant insight for this part of the problem space, given more significant factors in success.
Sure. I’m curious about what EY sees as the specific “more significant factors in [why sweet things are obstacles to] success [in excess of their stated calories]”.
Oh, okay. Probably should have known I couldn’t provide what you were looking for, but I wanted to get in a jab at confused critiques of the Laffer Curve and confused applications of conservation of energy to weight loss. :-)
Very skillful exploitation of the humor potential of this thread of conversation! Bravo!
Thank you. That appears to be an entirely reasonable explanation. (Where 'explanation' is not to signal 'needed to be justified' but rather 'interesting to know'.)
That is rather peculiar reasoning to hear from you. You seem to be acting with a level of self-importance that would only be justified if some future being were going to torture trans-Singularity trans-humans for not having done enough to accelerate the onset of the Singularity.
And that’s just stupid.
If citing BATNAs in negotiations is ‘dark arts’ then the phrase is even more useless than I thought.
So in the language of BATNA (Best Alternative To a Negotiated Agreement), my point was that if a high-powered software development career was not Eliezer's BATNA, citing it in a negotiation would be a dark arts move, just like putting a sign reading "limit 12 per customer" next to a display of tomato cans in a grocery store would be a dark arts move (it exploits anchoring and adjustment).
My current model of negotiations is that the only two numbers that really matter are the maximum price the buyer will pay and the minimum price the seller will accept. Figuring out what number actually gets chosen (which will be somewhere in between those two) is where the dark art of negotiating comes in; a toy sketch of this two-number model follows at the end of this comment. You could cite your BATNA as evidence that your walkaway point is some high or low value, in an attempt to reveal information about your preferences, but if you cited a BATNA that you didn't actually have, that would be a dark arts tactic.
I’d define dark arts as “attempting to influence the actions of others in ways other than helping them arrive at more correct beliefs or resolve inconsistent preferences”. I’m not entirely certain about the “resolve inconsistent preferences” bit though—this is kind of a weird zone, in my thinking—what part of me decides how to resolve inconsistencies in my preferences? Is it my preferences themselves?
I hate to be this deep in this discussion because I don’t mean to dis Eliezer, but I’d love to improve my thinking about negotiations if it’s wrong.
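Since the two-number model above is the one piece of explicit machinery in this discussion, here is a minimal Python sketch of it. This is only a hedged illustration, not anyone's actual method: the function names, the even-split default, and the dollar figures are all invented for the example, not taken from the thread.

```python
# A toy sketch of the two-number bargaining model described above.
# Everything here is an illustrative assumption: the function names, the
# even-split default, and the dollar figures are invented, not from the thread.

def bargaining_range(buyer_max, seller_min):
    """Return the zone of possible agreement, or None if the two
    walkaway points don't overlap (in which case no deal is possible)."""
    if buyer_max < seller_min:
        return None
    return (seller_min, buyer_max)

def settle(buyer_max, seller_min, leverage=0.5):
    """Pick a price inside the range.

    `leverage` stands in for everything the negotiating tactics fight over:
    0.0 gives the buyer the whole surplus, 1.0 gives it all to the seller.
    """
    zone = bargaining_range(buyer_max, seller_min)
    if zone is None:
        raise ValueError("walkaway points don't overlap; no deal is possible")
    low, high = zone
    return low + leverage * (high - low)

# Hypothetical numbers: the employer would pay at most $120K, the employee
# would accept no less than $80K. Any price in [80K, 120K] is a possible
# deal; negotiation tactics only determine where in that interval it lands.
print(settle(120_000, 80_000))        # 100000.0 with an even split
print(settle(120_000, 80_000, 0.75))  # 110000.0 if the seller bargains better
```

On this picture, citing a real BATNA shifts one of the two endpoints; citing a fake one only manipulates where inside the interval the price lands, which is why the comment above classes it as dark arts.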
I’d never heard that phrase before. Thanks for introducing it to me.
EDIT: To be clearer, I mean BATNA, not “dark arts.”