Do The Math, Then Burn The Math and Go With Your Gut


Do the math, then burn the math and go with your gut1 is a procedure for decision-making that has been described by Eliezer Yudkowsky. However, essentially the same procedure was described as early as 1906 by the economist Alfred Marshall in the context of economics research. The basic procedure is to go through the process of assigning numbers and probabilities that are relevant to some decision (“do the math”) and then to throw away this calculation and instead make the final decision with one’s gut feelings (“burn the math and go with your gut”). The purpose of the first step is to force oneself to think through all the details of the decision and to spot inconsistencies.
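A minimal sketch of the procedure is below. The considerations and weights are invented for illustration (none of them come from the sources cited here); the point is only to show the explicit-tally step that precedes the decision.

    # "Do the math": write every relevant consideration down and attach an
    # explicit, made-up weight to it. All names and numbers here are
    # hypothetical; they stand in for whatever the real decision involves.
    considerations = {
        "higher salary": +2.0,
        "longer hours": -2.5,
        "more interesting work": +3.0,
        "would need to relocate": -1.5,
    }

    total = sum(considerations.values())
    print(f"Explicit tally: {total:+.1f}")

    # "Burn the math and go with your gut": the tally itself is discarded.
    # Its job was to force every factor onto the table; the final call is the
    # gut feeling one has after having weighed everything, not the number.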

History

In 1906, the English economist Alfred Marshall described the procedure in the context of economics research in a letter to Arthur Bowley. From the letter:

But I know I had a growing feeling in the later years of my work at the subject that a good mathematical theorem dealing with economic hypotheses was very unlikely to be good economics: and I went more and more on the rules---(1) Use mathematics as a short-hand language, rather than as an engine of inquiry. (2) Keep to them till you have done. (3) Translate into English. (4) Then illustrate by examples that are important in real life. (5) Burn the mathematics. (6) If you can’t succeed in 4, burn 3. This last I did often.

In July 2008, Eliezer Yudkowsky wrote the blog post “When (Not) To Use Probabilities”, which discusses the situations in which it is a bad idea to verbally assign probabilities. Specifically, the post claims that while theoretical arguments in favor of using probabilities (such as Dutch book and coherence arguments) always apply, humans have evolved algorithms for reasoning under uncertainty that don’t involve verbally assigning probabilities (such as using “gut feelings”), and these algorithms in practice often perform better than explicitly assigning probabilities. In other words, the post argues in favor of using humans’ non-verbal, built-in forms of reasoning under uncertainty even if this makes humans incoherent and subject to Dutch books, because forcing humans to articulate probabilities would actually lead to worse outcomes. The post also contains the quote “there are benefits from trying to translate your gut feelings of uncertainty into verbal probabilities. It may help you spot problems like the conjunction fallacy. It may help you spot internal inconsistencies – though it may not show you any way to remedy them.”2

In October 2011, LessWrong user bentarm gave an outline of the procedure in a comment on a post about the Amanda Knox case. The steps were: “(1) write down a list of all of the relevant facts on either side of the argument. (2) assign numerical weights to each of the facts, according to how much they point you in one direction or another. (3) burn the piece of paper on which you wrote down the facts, and go with your gut.” This description was endorsed by Yudkowsky in a follow-up comment. bentarm’s comment claims that Yudkowsky described the procedure during the summer of 2011.3

In December 2012, the procedure was described by Yudkowsky in Chapter 86 of Harry Potter and the Methods of Rationality:

Harry was wondering if he could even get a Bayesian calculation out of this. Of course, the point of a subjective Bayesian calculation wasn’t that, after you made up a bunch of numbers, multiplying them out would give you an exactly right answer. The real point was that the process of making up numbers would force you to tally all the relevant facts and weigh all the relative probabilities. Like realizing, as soon as you actually thought about the probability of the Dark Mark not-fading if You-Know-Who was dead, that the probability wasn’t low enough for the observation to count as strong evidence. One version of the process was to tally hypotheses and list out evidence, make up all the numbers, do the calculation, and then throw out the final answer and go with your brain’s gut feeling after you’d forced it to really weigh everything. The trouble was that the items of evidence weren’t conditionally independent, and there were multiple interacting background facts of interest...

In December 2016, Anna Salamon described the procedure parenthetically at the end of a blog post. Salamon described it as follows: “Eliezer once described what I take to be a similar ritual for avoiding bucket errors, as follows: When deciding which apartment to rent (he said), one should first do out the math, and estimate the number of dollars each would cost, the number of minutes of commute time times the rate at which one values one’s time, and so on. But at the end of the day, if the math says the wrong thing, one should do the right thing anyway.”4
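A worked version of the apartment comparison Salamon describes might look like the sketch below. All figures (rents, commute times, the hourly value of time, the number of workdays) are invented for illustration; the source gives no numbers.

    # Hypothetical apartment comparison: rent plus the cost of commute time,
    # as in Salamon's description. Every number below is made up.
    HOURLY_VALUE_OF_TIME = 25.0   # assumed dollars per hour
    WORKDAYS_PER_MONTH = 21       # assumed

    apartments = {
        "apartment A": {"rent": 1400, "one_way_commute_minutes": 45},
        "apartment B": {"rent": 1700, "one_way_commute_minutes": 15},
    }

    for name, a in apartments.items():
        commute_hours = 2 * a["one_way_commute_minutes"] / 60 * WORKDAYS_PER_MONTH
        effective_cost = a["rent"] + commute_hours * HOURLY_VALUE_OF_TIME
        print(f"{name}: about ${effective_cost:,.0f} per month all-in")

    # Per the procedure, the estimate is then set aside: if, with the numbers
    # on the table, the "cheaper" option still feels like the wrong choice,
    # one does the right thing anyway.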

References

  1. Qiaochu Yuan. “Qiaochu_Yuan comments on A Sketch of Good Communication”. March 31, 2018. LessWrong.

  2. Eliezer Yudkowsky. “When (Not) To Use Probabilities”. July 23, 2008. LessWrong.

  3. bentarm. “bentarm comments on Amanda Knox: post mortem”. October 21, 2011. LessWrong.

  4. Anna Salamon. “‘Flinching away from truth’ is often about *protecting* the epistemology”. December 20, 2016. LessWrong.
