The main problem with the 1 million versus 7 million idea is that losing $100 versus gaining $200 comes nowhere near capturing the actual utility involved for pretty much anyone who doesn’t already have a lot of money.
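To make that concrete, here’s a rough sketch. The wealth levels and the log-utility model are purely illustrative assumptions on my part, not anything from the original argument; the point is just that the same ±$100/$200 bet translates into very different utility depending on how much money you start with:

```python
import math

# Toy diminishing-marginal-utility model (log utility) -- an illustrative
# assumption, not a claim about anyone's actual utility function.
def utility(wealth):
    return math.log(wealth)

for wealth in (1_000, 10_000, 1_000_000):  # hypothetical starting wealth levels
    loss = utility(wealth - 100) - utility(wealth)   # utility change from losing $100
    gain = utility(wealth + 200) - utility(wealth)   # utility change from gaining $200
    print(f"wealth ${wealth:>9,}: lose $100 -> {loss:+.5f}, "
          f"gain $200 -> {gain:+.5f}, gain/loss ratio {gain / -loss:.2f}")
```

Even under this fairly gentle model, the gain only looks like twice the loss once you’re already well off (the ratio climbs from about 1.7 at $1,000 toward 2.0 at $1,000,000), and for most people the felt cost of losing $100 is steeper than a log curve suggests.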
Perhaps the real lesson is that quantifying something doesn’t make it more accurate; it just moves around which parts you need to get accurate. Doing a calculation with inaccurate numbers is no better than coming up with an equally inaccurate result without any calculation. It’s also easier to make certain kinds of errors in the first place when you’re trying to quantify things and aren’t very good at it, even though Scott insists otherwise, or to use an overly simplified model without enough humility about your ability to build a useful one.
Doing a calculation also doesn’t excuse you from sanity-checking the result, which is where a lot of rationalist calculations go wrong.