Interesting. How would you explain behavior like NASA management’s cooking of shuttle safety numbers before the Challenger explosion, then? Richard Feynman is always a good read. It seems clear that at some level bureaucrats have often tried to optimize the wrong things, in this case “perceived safety,” but it seems reasonable that lots of other wrong things will get optimized too, including money.
It’s also a little problematic because the survey doesn’t measure quite what we want to measure. It could be a genuine effect, or it could be the gap between professed beliefs and real-world actions, with effects invisible to the survey emerging in the workplace. Or, given the general perception, it could be simple dishonesty: I looked up the source of the data, and it’s a face-to-face interview survey, which picks up more bias of that kind.