Short Online Texts Thread
Medicine:
“Do We Really Know What Makes Us Healthy?” (Gary Taubes, 2007)
“The End of Food: Has a tech entrepreneur come up with a product to replace our meals?” (Soylent)
Low-dose aspirin for mortality reduction revisited
Creatine self-experiment
Economics:
“SSC Gives A Graduation Speech” (on the value of college and alternatives)
“What You Should Know About Megaprojects and Why: An Overview”, Flyvbjerg 2014 (excerpts)
“Why Do Firms Buy Ads?” (advertising requires impossibly large sample sizes for meaningful results; see the back-of-the-envelope power calculation after this list)
“The Business Habits of Highly Effective Terrorists: Why Terror Masterminds Rely on Micro-Management”
Inferring nuclear bomb components from stock market returns
The Sewing Machine Patent Wars
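To put a rough number on the “impossibly large sample sizes” claim above: a minimal power-calculation sketch, assuming a purely hypothetical 1% baseline purchase rate and small relative lifts from advertising (the linked post may use different figures).

```python
# Rough power calculation for an A/B test of advertising effectiveness.
# All rates here are hypothetical, chosen only to illustrate the scale.
from statistics import NormalDist

def n_per_group(p_control, p_treated, alpha=0.05, power=0.8):
    """Approximate per-arm sample size for a two-proportion z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_b = NormalDist().inv_cdf(power)           # desired power
    var = p_control * (1 - p_control) + p_treated * (1 - p_treated)
    return (z_a + z_b) ** 2 * var / (p_control - p_treated) ** 2

# 1% baseline purchase rate; ads lift it by 5% or 1% relative.
print(f"{n_per_group(0.010, 0.0105):,.0f} users per arm for a 5% relative lift")  # ~640,000
print(f"{n_per_group(0.010, 0.0101):,.0f} users per arm for a 1% relative lift")  # ~15,600,000
```

If figures like these are even roughly right, a single advertiser rarely has enough traffic to measure its own ads’ effect directly, which is the puzzle the post is getting at.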
Politics:
“Hit or Miss? The Effect of Assassinations on Institutions and War”, Jones & Olken 2007 (excerpts)
“The Borgias vs Borgia: Faith and Fear” (accuracy in historical fiction)
“800 Years Of Human Sacrifice In Kent”
“King of Fearmongers: Morris Dees and the Southern Poverty Law Center, scaring donors since 1971”
“BUGGER: maybe the real state secret is that spies aren’t very good at their jobs”
“Beyond the One Percent: Categorizing Extreme Elites”
“free speech rights and ability”
“Exploring Elitist Democracy: The Latest from Gilens and Page”
Psychology:
“Common DNA Markers Can Account for More Than Half of the Genetic Influence on Cognitive Abilities”
Many Labs project published
“Slow Ideas: Some innovations spread fast. How do you speed the ones that don’t?”
“New meta-analysis checks the correlation between intelligence and faith: First systematic analysis of its kind even proposes reasons for the negative correlation”
Biases in grocery shopping
Philosophy:
“Don’t Fear the (Great) Filter”
“Humans are Utility Monsters”
“Universal Library”, by W.V. Quine
Literature:
“Sand Kings”, GRRM
“The Island”, Peter Watts
“Calmly We Walk through This April’s Day”, Delmore Schwartz
“Fermat’s Last Stand: Soundtrack and Adventure Log”
Parable of the Unjust Steward
Technology:
“Exponential and non-exponential trends in information technology” (LW)
“The Three Projections of Dr Futamura” (isomorphisms between compilers/interpreters/etc; a toy illustration of the first projection follows this list)
Framing Brian Krebs with heroin
“It’s the Latency, Stupid”
Medieval computer science: “STOC 1500”
“A World Without Randomness”
“Life Inside Brewster’s Magnificent Contraption” (Jason Scott on the Internet Archive)
“Mundane Magic”
Sand as a form of power storage
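A concrete (and heavily simplified) illustration of the first Futamura projection mentioned above: specializing an interpreter to a fixed program yields something that behaves like a compiled version of that program. The toy expression language and hand-written specializer below are my own sketch, not taken from the linked post.

```python
# Tiny expression language: ("lit", n), ("var", name), ("add", e1, e2), ("mul", e1, e2).

def interpret(expr, env):
    """Ordinary interpreter: walks the program tree on every call."""
    op = expr[0]
    if op == "lit": return expr[1]
    if op == "var": return env[expr[1]]
    if op == "add": return interpret(expr[1], env) + interpret(expr[2], env)
    if op == "mul": return interpret(expr[1], env) * interpret(expr[2], env)

def specialize(expr):
    """First Futamura projection in miniature: partially evaluate the
    interpreter with respect to a fixed program, returning a residual
    function of the remaining input (the environment). The dispatch on
    program structure happens once here, not on every evaluation."""
    op = expr[0]
    if op == "lit":
        n = expr[1]
        return lambda env: n
    if op == "var":
        name = expr[1]
        return lambda env: env[name]
    if op == "add":
        f, g = specialize(expr[1]), specialize(expr[2])
        return lambda env: f(env) + g(env)
    if op == "mul":
        f, g = specialize(expr[1]), specialize(expr[2])
        return lambda env: f(env) * g(env)

program = ("add", ("mul", ("var", "x"), ("lit", 3)), ("lit", 1))  # 3*x + 1
compiled = specialize(program)   # "compiling" = specializing the interpreter
print(interpret(program, {"x": 5}), compiled({"x": 5}))  # both print 16
```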
Statistics:
“Search for the Wreckage of Air France Flight AF 447”, Stone et al 2014 (technical report)
“What do null fields tell us about scientific fraud?”
“A whole fleet of gremlins: Looking more carefully at Richard Tol’s twice-corrected paper, ‘The Economic Effects of Climate Change’”
“Theory-testing in psychology and physics: a methodological paradox”, Meehl 1967 (excerpts)
“What Bayesianism Taught Me”
“The robust beauty of improper linear models in decision making”
“Big Data needs Big Model” (converting non-random Xbox-based polling into accurate election forecasts by modeling the non-randomness & adjusting for it; a toy sketch of the reweighting idea follows this list)
What are statistical models?
“Non-industry-Sponsored Preclinical Studies on Statins Yield Greater Efficacy Estimates Than Industry-Sponsored Studies: A Meta-Analysis”, Krauth et al 2014 (Typically when you look at study results with an industry-funding variable, you find that industry studies are biased upwards—this is the sort of study that comes up in books like Bad Pharma—but here we seem to see the opposite: it’s the non-industry (academic/nonprofit/government) funding which seems to be biased towards finding effects. Interestingly, this is for studies early in the drug pipeline, while IIRC the usual studies examine drugs later in the approval pipeline, after they have reached human clinical trials. This immediately suggests an economic rationale: early in the process, drug companies have incentives to reach accurate results in order to avoid investing much in drugs which won’t ultimately work; but later in the process, having managed to get a drug close to approval, they have incentives to cook the books in order to force approval regardless. So for preliminary results, you would want to distrust academic work and trust industry findings, but then at some point flip your assessments and start assuming the opposite. Makes me wonder where the midpoint is, at which neither group is the more untrustworthy.)
How to Measure Anything review
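A toy version of the adjustment idea in the “Big Data needs Big Model” item above: estimate within-cell opinion from a badly skewed opt-in sample, then reweight the cell estimates by known population shares (poststratification). All the demographic cells and rates below are invented for illustration; the actual paper fits a multilevel regression rather than using raw cell means.

```python
# Toy poststratification: correcting a non-random (Xbox-style) poll
# by reweighting cell-level estimates to known population shares.
import random
random.seed(0)

# Hypothetical true support for a candidate in each demographic cell.
true_support = {"young_male": 0.40, "young_female": 0.55,
                "old_male":   0.45, "old_female":   0.60}
# Population composition, known from (say) a census.
pop_share    = {"young_male": 0.20, "young_female": 0.20,
                "old_male":   0.30, "old_female":   0.30}
# A badly skewed opt-in sample: mostly young men.
sample_share = {"young_male": 0.70, "young_female": 0.15,
                "old_male":   0.10, "old_female":   0.05}

n = 20_000
cells = list(true_support)
drawn = random.choices(cells, weights=[sample_share[c] for c in cells], k=n)
sample = [(c, random.random() < true_support[c]) for c in drawn]

# Naive estimate: the raw sample mean, biased toward the over-represented cells.
naive = sum(v for _, v in sample) / n

# Poststratified estimate: estimate support within each cell (a plain cell mean
# stands in for the multilevel regression), then reweight by *population* shares.
cell_est = {c: sum(v for cc, v in sample if cc == c) /
               max(1, sum(1 for cc, _ in sample if cc == c))
            for c in cells}
adjusted = sum(pop_share[c] * cell_est[c] for c in cells)

truth = sum(pop_share[c] * true_support[c] for c in cells)
print(f"truth={truth:.3f}  naive={naive:.3f}  poststratified={adjusted:.3f}")
```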
Science:
“Predictive brains, situated agents, and the future of cognitive science”, Clark 2013
“Detection of Near-Earth Asteroids”
“Cosmic Horror: In which we confront the terrible racism of H. P. Lovecraft”
“How Athletes Get Great: Just train for 10,000 hours, right? Not quite. In his new book, author David Epstein argues that top-shelf athletic performance may be a more complicated formula than we’ve recently come to believe.”
A bit of humor: “World’s Supercomputers Release Study Confirming They Are Not Powerful Enough”
“The day I left my son in the car”: a journalist on the irrational fears of parenting. It gives a fair hearing to a hard-nosed rationalist view, though it’s not posed as a debate.
Failed theories of superconductivity. My favorite part:

The second idea proposed in 1932 by Bohr and Kronig was that superconductivity would result from the coherent quantum motion of a lattice of electrons. Given Bloch’s stature in the field, theorists like Niels Bohr were eager to discuss their own ideas with him. In fact, Bohr, whose theory for superconductivity had already been accepted for publication in the July 1932 issue of the journal “Die Naturwissenschaften”, withdrew his article in the proof stage because of Bloch’s criticism (see Ref.[20]). Kronig was most likely also aware of Bloch’s opinion when he published his ideas[22]. Only months after the first publication he responded to the criticism made by Bohr and Bloch in a second manuscript[23]. It is tempting to speculate that his decision to publish and later defend his theory was influenced by an earlier experience: in 1925 Kronig proposed that the electron carries spin, i.e. possesses an internal angular momentum. Wolfgang Pauli’s response to this idea was that it was interesting but incorrect, which discouraged Kronig from publishing it. The proposal for the electron spin was made shortly thereafter by Samuel Goudsmit and George Uhlenbeck[29]. Kronig might have concluded that it is not always wise to follow the advice of an established and respected expert.
“History of what didn’t work” seems like an important genre, for example if you want help avoiding hindsight/survivorship biases. Are there other good examples? It seems a lot of histories of science impose a false sense of direction or inevitability and cover few dead ends, if any; all I can think of are some biographies that cover a lone genius’s missteps on his way to the true theory.
Pseudoscience is sometimes useful for finding examples—there’s a whole subclass of pseudosciences (particularly in alternative medicine and pseudophysics) based on advocating an old, formerly-mainstream theory that turned out to be wrong. It would almost be a reliable way to generate new alternative medicines.