Great book. It was percolating around CFAR a few months back—I (Dan from CFAR) read it, several other people read at least part of the book or my notes on it, and we had some conversations about it. A few things from the book that stuck out to me (although some may have been slightly distorted by memory):
the definition of “measurement of X” as anything you do that reduces your uncertainty about X (which is nice and Bayesian)
the first step in dealing with a problem, which Hubbard often had to lead people through when they brought him in as a consultant, is being specific about what the concrete issue at stake is and why it matters, e.g., translating "IT security" into things like "people being unable to work due to network downtime." (CFAR already had a unit on Being Specific, and it turned out that Hubbard had an exercise extremely similar to the Monday-Tuesday game we were using)
the importance of the skill of calibrated estimation, and calibration techniques discussed in the OP
the value of Fermi estimation—Hubbard said that the Fermi method of decomposing a business question into subcomponents was usually necessary, and sometimes sufficient, for figuring out what to do
Hubbard also has an approach for combining Fermi estimation with calibrated confidence intervals on subcomponents, and using Monte Carlo simulation to get a calibrated confidence interval for the main question. It would be cool to get that method down, but I haven’t used it.
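As I understand it, the method works roughly like this (a minimal sketch with made-up numbers, not Hubbard's own worked example): decompose the question Fermi-style, put a calibrated 90% interval on each subcomponent, treat each interval as a distribution (here I assume normal, with the interval spanning about 3.29 standard deviations), and run a Monte Carlo simulation to get a 90% interval on the combined quantity.

```python
import random

# Hypothetical Fermi decomposition of "annual cost of network downtime":
# each subcomponent gets a calibrated 90% confidence interval (lo, hi).
intervals = {
    "outages_per_year": (2, 12),
    "hours_per_outage": (1, 8),
    "workers_affected": (50, 300),
    "cost_per_worker_hour": (20, 80),
}

def sample(lo, hi):
    # Treat the 90% CI as a normal distribution: mean at the midpoint,
    # with the interval spanning about 3.29 standard deviations.
    mean = (lo + hi) / 2
    sd = (hi - lo) / 3.29
    return random.gauss(mean, sd)

def simulate(n=100_000):
    totals = []
    for _ in range(n):
        cost = 1.0
        for lo, hi in intervals.values():
            cost *= sample(lo, hi)
        totals.append(cost)
    totals.sort()
    # The 5th and 95th percentiles give a 90% interval for the product.
    return totals[int(0.05 * n)], totals[int(0.95 * n)]

lo, hi = simulate()
print(f"90% CI for annual downtime cost: ${lo:,.0f} to ${hi:,.0f}")
```

The point of the simulation is that you can't just multiply the interval endpoints: the resulting interval would be far too wide, since it's unlikely that every subcomponent lands at its extreme simultaneously.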
Before you seek out information, identify what information would actually be useful—would it change what I do? Figure out the value of information. VOI was already part of the LW idea library and the subject of a CFAR unit, but I suspect that How to Measure Anything has helped me internalize that question.
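A toy version of the "would this change what I do?" question, with numbers I've made up for illustration: the expected value of perfect information (EVPI) on a variable is the gap between what you'd expect to earn deciding with perfect knowledge of it and what you expect to earn deciding now, and it caps what any measurement of that variable could be worth.

```python
# Hypothetical go/no-go decision: a project pays $1M if it succeeds and
# loses $400K if it fails. Calibrated estimate: 60% chance of success.
p_success = 0.6
payoff_success = 1_000_000
payoff_failure = -400_000

# Expected value of deciding with current information: launch iff EV > 0.
ev_launch = p_success * payoff_success + (1 - p_success) * payoff_failure
ev_now = max(ev_launch, 0)  # the alternative is not launching (value 0)

# With perfect information, we'd launch only when it would succeed:
ev_perfect = p_success * payoff_success + (1 - p_success) * 0

# EVPI: the most any measurement of this variable could be worth.
evpi = ev_perfect - ev_now
print(f"EV now: ${ev_now:,.0f}; EVPI: ${evpi:,.0f}")
# → EV now: $440,000; EVPI: $160,000
```

If the EVPI here were near zero, no study of the success probability would be worth commissioning; since it's $160K, a measurement costing less than that could pay for itself.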