Bah! Listen, Eliezer, I’m tired of all your meta-hipsterism!
“Hey, let’s get some ethics at Starbucks.” “Nah, it’s low-quality; I only buy a really obscure brand of ethics you’ve probably never heard of, called MIRI.”
“Hey man, you don’t look in good health, maybe you should see a doctor.” “Nah, I like a really obscure form of healthcare, I bet you’re not signed up for it; it’s called ‘cryonics’. It’s the cool thing to do.”
“I think I like you, let’s date.” “Oh, I’m afraid I only date polyamorists; you’re just too square.”
“Oh man, I just realized I committed hindsight bias the other day!” “I disagree, it’s really the more obscure backfire effect, which just got published a year or two ago.”
“Yo, check out this thing I did with statistics.” “That’s cool. Did you use Bayesian techniques?”
Man, forget you!
/angrily sips his obscure mail-order loose tea, a kind of oolong you’ve never heard of (Formosa vintage tie-guan-yin)
If you can’t pick something non-average to meet your optimization criteria, you can’t optimize above the average.
This comment has been brought to you by my Dvorak keyboard layout.
If you keep looking down the utility gradient, it’s harder to escape local maxima because you’re facing backwards.
This comment has been brought to you by me switching from Dvorak to Colemak.
I’m always amazed that people advocate Dvorak. If you are going to diverge from the herd and be a munchkin, why do a half-assed job of it? Sure, if you already know Dvorak it isn’t worth switching, but if you are switching from Qwerty anyway, then Colemak (or at least Capewell) is better than Dvorak in all the ways that Dvorak is better than Qwerty.
Dvorak is for hipsters, not optimisers.
Tim Tyler is the actual optimizer here.
But at the same time, there are only so many possible low-hanging fruits, etc.; past a certain point, continually finding more fruits indicates you aren’t optimizing at all...
Ouch, that cuts a bit close to home...
(Had to google “backfire effect” to find out whether you had made it up on the spot.)
EDIT: Looks like I had already heard of that effect, and I even seem to recall E.T. Jaynes giving a theoretical explanation of it, but I didn’t remember whether it had a name.
Had to google “backfire effect” to find out whether you had made it up on the spot.
“Like I said, it’s a really obscure bias, you’ve probably never heard of it.”
I even seem to recall E.T. Jaynes giving a theoretical explanation of it
Really? I don’t remember ever seeing anything like that (although I haven’t read all of PT:TLoS yet). Maybe you’re conflating it with the thesis using Bayesian methods that I link at http://www.gwern.net/backfire-effect ?
I can’t tell if I should feel good or bad that this was the only one where I said “well, actually...”